Apr 24 21:14:21.024979 ip-10-0-128-142 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 24 21:14:21.024991 ip-10-0-128-142 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 24 21:14:21.024998 ip-10-0-128-142 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 24 21:14:21.025233 ip-10-0-128-142 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 24 21:14:31.154668 ip-10-0-128-142 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 24 21:14:31.154685 ip-10-0-128-142 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot f32c493372354e9385077452104a2676 --
Apr 24 21:16:51.838171 ip-10-0-128-142 systemd[1]: Starting Kubernetes Kubelet...
Apr 24 21:16:52.206951 ip-10-0-128-142 kubenswrapper[2560]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 21:16:52.206951 ip-10-0-128-142 kubenswrapper[2560]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 24 21:16:52.206951 ip-10-0-128-142 kubenswrapper[2560]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 21:16:52.206951 ip-10-0-128-142 kubenswrapper[2560]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 24 21:16:52.206951 ip-10-0-128-142 kubenswrapper[2560]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 21:16:52.208387 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.208304 2560 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 24 21:16:52.211006 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.210986 2560 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 21:16:52.211006 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.211000 2560 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 21:16:52.211006 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.211007 2560 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 21:16:52.211191 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.211012 2560 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 21:16:52.211191 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.211018 2560 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 21:16:52.211191 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.211022 2560 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 21:16:52.211191 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.211026 2560 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 21:16:52.211191 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.211030 2560 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 21:16:52.211191 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.211034 2560 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 21:16:52.211191 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.211038 2560 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 21:16:52.211191 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.211049 2560 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 21:16:52.211191 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.211053 2560 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 21:16:52.211191 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.211057 2560 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 21:16:52.211191 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.211061 2560 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 21:16:52.211191 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.211065 2560 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 21:16:52.211191 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.211069 2560 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 21:16:52.211191 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.211073 2560 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 21:16:52.211191 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.211076 2560 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 21:16:52.211191 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.211080 2560 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 21:16:52.211191 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.211084 2560 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 21:16:52.211191 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.211088 2560 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 21:16:52.211191 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.211092 2560 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 21:16:52.211988 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.211096 2560 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 21:16:52.211988 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.211100 2560 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 21:16:52.211988 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.211104 2560 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 21:16:52.211988 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.211108 2560 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 21:16:52.211988 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.211113 2560 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 21:16:52.211988 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.211117 2560 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 21:16:52.211988 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.211121 2560 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 21:16:52.211988 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.211125 2560 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 21:16:52.211988 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.211128 2560 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 21:16:52.211988 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.211133 2560 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 21:16:52.211988 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.211137 2560 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 21:16:52.211988 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.211142 2560 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 21:16:52.211988 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.211146 2560 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 21:16:52.211988 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.211150 2560 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 21:16:52.211988 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.211156 2560 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 21:16:52.211988 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.211161 2560 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 21:16:52.211988 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.211166 2560 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 21:16:52.211988 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.211170 2560 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 21:16:52.211988 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.211174 2560 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 21:16:52.211988 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.211178 2560 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 21:16:52.212656 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.211182 2560 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 21:16:52.212656 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.211186 2560 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 21:16:52.212656 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.211192 2560 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 21:16:52.212656 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.211198 2560 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 21:16:52.212656 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.211203 2560 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 21:16:52.212656 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.211208 2560 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 21:16:52.212656 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.211213 2560 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 21:16:52.212656 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.211217 2560 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 21:16:52.212656 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.211221 2560 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 21:16:52.212656 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.211226 2560 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 21:16:52.212656 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.211231 2560 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 21:16:52.212656 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.211235 2560 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 21:16:52.212656 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.211239 2560 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 21:16:52.212656 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.211244 2560 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 21:16:52.212656 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.211248 2560 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 21:16:52.212656 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.211252 2560 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 21:16:52.212656 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.211256 2560 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 21:16:52.212656 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.211260 2560 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 21:16:52.212656 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.211265 2560 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 21:16:52.213155 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.211269 2560 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 21:16:52.213155 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.211273 2560 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 21:16:52.213155 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.211277 2560 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 21:16:52.213155 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.211281 2560 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 21:16:52.213155 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.211285 2560 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 21:16:52.213155 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.211290 2560 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 21:16:52.213155 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.211294 2560 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 21:16:52.213155 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.211299 2560 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 21:16:52.213155 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.211305 2560 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 21:16:52.213155 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.211309 2560 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 21:16:52.213155 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.211313 2560 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 21:16:52.213155 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.211317 2560 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 21:16:52.213155 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.211321 2560 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 21:16:52.213155 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.211325 2560 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 21:16:52.213155 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.211328 2560 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 21:16:52.213155 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.211332 2560 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 21:16:52.213155 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.211336 2560 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 21:16:52.213155 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.211340 2560 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 21:16:52.213155 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.211344 2560 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 21:16:52.213155 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.211350 2560 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 21:16:52.213632 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.211354 2560 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 21:16:52.213632 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.211358 2560 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 21:16:52.213632 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.211363 2560 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 21:16:52.213632 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.211366 2560 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 21:16:52.213632 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.211370 2560 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 21:16:52.213632 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.211956 2560 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 21:16:52.213632 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.211965 2560 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 21:16:52.213632 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.211969 2560 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 21:16:52.213632 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.211973 2560 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 21:16:52.213632 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.211977 2560 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 21:16:52.213632 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.211981 2560 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 21:16:52.213632 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.211986 2560 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 21:16:52.213632 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.211990 2560 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 21:16:52.213632 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.211994 2560 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 21:16:52.213632 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.211998 2560 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 21:16:52.213632 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.212002 2560 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 21:16:52.213632 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.212006 2560 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 21:16:52.213632 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.212011 2560 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 21:16:52.213632 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.212015 2560 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 21:16:52.213632 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.212019 2560 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 21:16:52.214522 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.212023 2560 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 21:16:52.214522 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.212027 2560 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 21:16:52.214522 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.212031 2560 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 21:16:52.214522 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.212035 2560 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 21:16:52.214522 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.212039 2560 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 21:16:52.214522 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.212043 2560 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 21:16:52.214522 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.212047 2560 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 21:16:52.214522 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.212051 2560 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 21:16:52.214522 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.212055 2560 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 21:16:52.214522 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.212059 2560 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 21:16:52.214522 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.212063 2560 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 21:16:52.214522 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.212066 2560 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 21:16:52.214522 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.212070 2560 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 21:16:52.214522 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.212074 2560 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 21:16:52.214522 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.212079 2560 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 21:16:52.214522 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.212083 2560 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 21:16:52.214522 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.212087 2560 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 21:16:52.214522 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.212091 2560 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 21:16:52.214522 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.212095 2560 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 21:16:52.215243 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.212101 2560 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 21:16:52.215243 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.212106 2560 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 21:16:52.215243 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.212110 2560 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 21:16:52.215243 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.212114 2560 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 21:16:52.215243 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.212118 2560 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 21:16:52.215243 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.212122 2560 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 21:16:52.215243 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.212127 2560 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 21:16:52.215243 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.212131 2560 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 21:16:52.215243 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.212135 2560 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 21:16:52.215243 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.212141 2560 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 21:16:52.215243 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.212145 2560 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 21:16:52.215243 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.212150 2560 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 21:16:52.215243 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.212154 2560 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 21:16:52.215243 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.212158 2560 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 21:16:52.215243 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.212162 2560 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 21:16:52.215243 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.212166 2560 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 21:16:52.215243 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.212170 2560 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 21:16:52.215243 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.212174 2560 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 21:16:52.215243 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.212178 2560 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 21:16:52.215774 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.212182 2560 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 21:16:52.215774 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.212186 2560 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 21:16:52.215774 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.212190 2560 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 21:16:52.215774 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.212194 2560 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 21:16:52.215774 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.212198 2560 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 21:16:52.215774 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.212202 2560 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 21:16:52.215774 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.212206 2560 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 21:16:52.215774 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.212212 2560 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 21:16:52.215774 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.212218 2560 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 21:16:52.215774 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.212224 2560 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 21:16:52.215774 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.212228 2560 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 21:16:52.215774 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.212232 2560 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 21:16:52.215774 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.212236 2560 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 21:16:52.215774 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.212241 2560 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 21:16:52.215774 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.212245 2560 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 21:16:52.215774 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.212250 2560 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 21:16:52.215774 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.212256 2560 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 21:16:52.215774 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.212260 2560 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 21:16:52.215774 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.212265 2560 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 21:16:52.215774 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.212269 2560 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 21:16:52.216316 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.212273 2560 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 21:16:52.216316 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.212278 2560 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 21:16:52.216316 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.212282 2560 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 21:16:52.216316 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.212286 2560 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 21:16:52.216316 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.212290 2560 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 21:16:52.216316 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.212294 2560 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 21:16:52.216316 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.212298 2560 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 21:16:52.216316 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.212302 2560 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 21:16:52.216316 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.212306 2560 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 21:16:52.216316 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.212310 2560 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 21:16:52.216316 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.212315 2560 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 21:16:52.216316 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.212319 2560 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 21:16:52.216316 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.212323 2560 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 21:16:52.216316 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.213628 2560 flags.go:64] FLAG: --address="0.0.0.0"
Apr 24 21:16:52.216316 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.213649 2560 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 24 21:16:52.216316 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.213658 2560 flags.go:64] FLAG: --anonymous-auth="true"
Apr 24 21:16:52.216316 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.213665 2560 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 24 21:16:52.216316 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.213672 2560 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 24 21:16:52.216316 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.213677 2560 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 24 21:16:52.216316 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.213683 2560 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 24 21:16:52.216316 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.213690 2560 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 24 21:16:52.216973 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.213695 2560 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 24 21:16:52.216973 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.213700 2560 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 24 21:16:52.216973 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.213706 2560 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 24 21:16:52.216973 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.213711 2560 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 24 21:16:52.216973 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.213716 2560 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 24 21:16:52.216973 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.213721 2560 flags.go:64] FLAG: --cgroup-root=""
Apr 24 21:16:52.216973 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.213726 2560 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 24 21:16:52.216973 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.213730 2560 flags.go:64] FLAG: --client-ca-file=""
Apr 24 21:16:52.216973 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.213735 2560 flags.go:64] FLAG: --cloud-config=""
Apr 24 21:16:52.216973 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.213740 2560 flags.go:64] FLAG: --cloud-provider="external"
Apr 24 21:16:52.216973 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.213744 2560 flags.go:64] FLAG: --cluster-dns="[]"
Apr 24 21:16:52.216973 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.213752 2560 flags.go:64] FLAG: --cluster-domain=""
Apr 24 21:16:52.216973 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.213757 2560 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 24 21:16:52.216973 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.213762 2560 flags.go:64] FLAG: --config-dir=""
Apr 24 21:16:52.216973 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.213766 2560 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 24 21:16:52.216973 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.213771 2560 flags.go:64] FLAG: --container-log-max-files="5"
Apr 24 21:16:52.216973 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.213777 2560 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 24 21:16:52.216973 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.213783 2560 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 24 21:16:52.216973 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.213789 2560 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 24 21:16:52.216973 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.213794 2560 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 24 21:16:52.216973 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.213799 2560 flags.go:64] FLAG: --contention-profiling="false"
Apr 24 21:16:52.216973 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.213803 2560 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 24 21:16:52.216973 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.213808 2560 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 24 21:16:52.216973 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.213812 2560 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 24 21:16:52.216973 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.213817 2560 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 24 21:16:52.217627 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.213823 2560 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 24 21:16:52.217627 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.213828 2560 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 24 21:16:52.217627 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.213833 2560 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 24 21:16:52.217627 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.213838 2560 flags.go:64] FLAG: --enable-load-reader="false"
Apr 24 21:16:52.217627 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.213842 2560 flags.go:64] FLAG: --enable-server="true"
Apr 24 21:16:52.217627 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.213847 2560 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 24 21:16:52.217627 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.213853 2560 flags.go:64] FLAG: --event-burst="100"
Apr 24 21:16:52.217627 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.213859 2560 flags.go:64] FLAG: --event-qps="50"
Apr 24 21:16:52.217627 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.213863 2560 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 24 21:16:52.217627 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.213868 2560 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 24 21:16:52.217627 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.213873 2560 flags.go:64] FLAG: --eviction-hard=""
Apr 24 21:16:52.217627 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.213879 2560 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 24 21:16:52.217627 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.213884 2560 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 24 21:16:52.217627 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.213889 2560 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 24 21:16:52.217627 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.213893 2560 flags.go:64] FLAG: --eviction-soft=""
Apr 24 21:16:52.217627 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.213898 2560 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 24 21:16:52.217627 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.213903 2560 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 24 21:16:52.217627 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.213907 2560 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 24 21:16:52.217627 ip-10-0-128-142
kubenswrapper[2560]: I0424 21:16:52.213912 2560 flags.go:64] FLAG: --experimental-mounter-path="" Apr 24 21:16:52.217627 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.213917 2560 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 24 21:16:52.217627 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.213935 2560 flags.go:64] FLAG: --fail-swap-on="true" Apr 24 21:16:52.217627 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.213940 2560 flags.go:64] FLAG: --feature-gates="" Apr 24 21:16:52.217627 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.213946 2560 flags.go:64] FLAG: --file-check-frequency="20s" Apr 24 21:16:52.217627 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.213951 2560 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 24 21:16:52.217627 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.213955 2560 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 24 21:16:52.218359 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.213961 2560 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 24 21:16:52.218359 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.213966 2560 flags.go:64] FLAG: --healthz-port="10248" Apr 24 21:16:52.218359 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.213972 2560 flags.go:64] FLAG: --help="false" Apr 24 21:16:52.218359 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.213976 2560 flags.go:64] FLAG: --hostname-override="ip-10-0-128-142.ec2.internal" Apr 24 21:16:52.218359 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.213981 2560 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 24 21:16:52.218359 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.213987 2560 flags.go:64] FLAG: --http-check-frequency="20s" Apr 24 21:16:52.218359 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.213992 2560 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 24 21:16:52.218359 ip-10-0-128-142 kubenswrapper[2560]: I0424 
21:16:52.213997 2560 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 24 21:16:52.218359 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.214002 2560 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 24 21:16:52.218359 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.214007 2560 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 24 21:16:52.218359 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.214011 2560 flags.go:64] FLAG: --image-service-endpoint="" Apr 24 21:16:52.218359 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.214015 2560 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 24 21:16:52.218359 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.214019 2560 flags.go:64] FLAG: --kube-api-burst="100" Apr 24 21:16:52.218359 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.214024 2560 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 24 21:16:52.218359 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.214029 2560 flags.go:64] FLAG: --kube-api-qps="50" Apr 24 21:16:52.218359 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.214033 2560 flags.go:64] FLAG: --kube-reserved="" Apr 24 21:16:52.218359 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.214038 2560 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 24 21:16:52.218359 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.214042 2560 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 24 21:16:52.218359 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.214046 2560 flags.go:64] FLAG: --kubelet-cgroups="" Apr 24 21:16:52.218359 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.214051 2560 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 24 21:16:52.218359 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.214056 2560 flags.go:64] FLAG: --lock-file="" Apr 24 21:16:52.218359 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.214061 
2560 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 24 21:16:52.218359 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.214066 2560 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 24 21:16:52.218359 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.214071 2560 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 24 21:16:52.218968 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.214080 2560 flags.go:64] FLAG: --log-json-split-stream="false" Apr 24 21:16:52.218968 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.214084 2560 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 24 21:16:52.218968 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.214088 2560 flags.go:64] FLAG: --log-text-split-stream="false" Apr 24 21:16:52.218968 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.214093 2560 flags.go:64] FLAG: --logging-format="text" Apr 24 21:16:52.218968 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.214098 2560 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 24 21:16:52.218968 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.214103 2560 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 24 21:16:52.218968 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.214107 2560 flags.go:64] FLAG: --manifest-url="" Apr 24 21:16:52.218968 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.214112 2560 flags.go:64] FLAG: --manifest-url-header="" Apr 24 21:16:52.218968 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.214119 2560 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 24 21:16:52.218968 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.214126 2560 flags.go:64] FLAG: --max-open-files="1000000" Apr 24 21:16:52.218968 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.214133 2560 flags.go:64] FLAG: --max-pods="110" Apr 24 21:16:52.218968 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.214138 2560 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 24 21:16:52.218968 
ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.214143 2560 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 24 21:16:52.218968 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.214147 2560 flags.go:64] FLAG: --memory-manager-policy="None" Apr 24 21:16:52.218968 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.214152 2560 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 24 21:16:52.218968 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.214157 2560 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 24 21:16:52.218968 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.214161 2560 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 24 21:16:52.218968 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.214166 2560 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 24 21:16:52.218968 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.214179 2560 flags.go:64] FLAG: --node-status-max-images="50" Apr 24 21:16:52.218968 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.214184 2560 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 24 21:16:52.218968 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.214189 2560 flags.go:64] FLAG: --oom-score-adj="-999" Apr 24 21:16:52.218968 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.214194 2560 flags.go:64] FLAG: --pod-cidr="" Apr 24 21:16:52.218968 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.214201 2560 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 24 21:16:52.219528 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.214211 2560 flags.go:64] FLAG: --pod-manifest-path="" Apr 24 21:16:52.219528 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.214216 2560 flags.go:64] FLAG: --pod-max-pids="-1" Apr 24 21:16:52.219528 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.214221 2560 flags.go:64] 
FLAG: --pods-per-core="0" Apr 24 21:16:52.219528 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.214225 2560 flags.go:64] FLAG: --port="10250" Apr 24 21:16:52.219528 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.214230 2560 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 24 21:16:52.219528 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.214235 2560 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-02250900a1dde8dd9" Apr 24 21:16:52.219528 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.214243 2560 flags.go:64] FLAG: --qos-reserved="" Apr 24 21:16:52.219528 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.214248 2560 flags.go:64] FLAG: --read-only-port="10255" Apr 24 21:16:52.219528 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.214253 2560 flags.go:64] FLAG: --register-node="true" Apr 24 21:16:52.219528 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.214258 2560 flags.go:64] FLAG: --register-schedulable="true" Apr 24 21:16:52.219528 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.214262 2560 flags.go:64] FLAG: --register-with-taints="" Apr 24 21:16:52.219528 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.214268 2560 flags.go:64] FLAG: --registry-burst="10" Apr 24 21:16:52.219528 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.214273 2560 flags.go:64] FLAG: --registry-qps="5" Apr 24 21:16:52.219528 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.214277 2560 flags.go:64] FLAG: --reserved-cpus="" Apr 24 21:16:52.219528 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.214281 2560 flags.go:64] FLAG: --reserved-memory="" Apr 24 21:16:52.219528 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.214287 2560 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 24 21:16:52.219528 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.214292 2560 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 24 21:16:52.219528 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.214297 2560 flags.go:64] FLAG: 
--rotate-certificates="false" Apr 24 21:16:52.219528 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.214302 2560 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 24 21:16:52.219528 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.214307 2560 flags.go:64] FLAG: --runonce="false" Apr 24 21:16:52.219528 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.214312 2560 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 24 21:16:52.219528 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.214317 2560 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 24 21:16:52.219528 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.214321 2560 flags.go:64] FLAG: --seccomp-default="false" Apr 24 21:16:52.219528 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.214326 2560 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 24 21:16:52.219528 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.214330 2560 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 24 21:16:52.219528 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.214335 2560 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 24 21:16:52.220151 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.214340 2560 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 24 21:16:52.220151 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.214345 2560 flags.go:64] FLAG: --storage-driver-password="root" Apr 24 21:16:52.220151 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.214349 2560 flags.go:64] FLAG: --storage-driver-secure="false" Apr 24 21:16:52.220151 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.214354 2560 flags.go:64] FLAG: --storage-driver-table="stats" Apr 24 21:16:52.220151 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.214358 2560 flags.go:64] FLAG: --storage-driver-user="root" Apr 24 21:16:52.220151 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.214367 2560 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 24 
21:16:52.220151 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.214372 2560 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 24 21:16:52.220151 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.214377 2560 flags.go:64] FLAG: --system-cgroups="" Apr 24 21:16:52.220151 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.214382 2560 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 24 21:16:52.220151 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.214390 2560 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 24 21:16:52.220151 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.214395 2560 flags.go:64] FLAG: --tls-cert-file="" Apr 24 21:16:52.220151 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.214399 2560 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 24 21:16:52.220151 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.214407 2560 flags.go:64] FLAG: --tls-min-version="" Apr 24 21:16:52.220151 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.214412 2560 flags.go:64] FLAG: --tls-private-key-file="" Apr 24 21:16:52.220151 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.214416 2560 flags.go:64] FLAG: --topology-manager-policy="none" Apr 24 21:16:52.220151 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.214421 2560 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 24 21:16:52.220151 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.214426 2560 flags.go:64] FLAG: --topology-manager-scope="container" Apr 24 21:16:52.220151 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.214431 2560 flags.go:64] FLAG: --v="2" Apr 24 21:16:52.220151 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.214437 2560 flags.go:64] FLAG: --version="false" Apr 24 21:16:52.220151 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.214443 2560 flags.go:64] FLAG: --vmodule="" Apr 24 21:16:52.220151 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.214449 2560 flags.go:64] FLAG: 
--volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 24 21:16:52.220151 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.214455 2560 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 24 21:16:52.220151 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.214618 2560 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 21:16:52.220151 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.214625 2560 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 21:16:52.220151 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.214630 2560 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 21:16:52.220719 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.214634 2560 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 21:16:52.220719 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.214640 2560 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 21:16:52.220719 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.214645 2560 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 21:16:52.220719 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.214650 2560 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 21:16:52.220719 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.214654 2560 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 21:16:52.220719 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.214659 2560 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 21:16:52.220719 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.214663 2560 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 21:16:52.220719 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.214667 2560 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 21:16:52.220719 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.214671 2560 
feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 21:16:52.220719 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.214675 2560 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 21:16:52.220719 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.214679 2560 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 21:16:52.220719 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.214683 2560 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 21:16:52.220719 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.214689 2560 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 21:16:52.220719 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.214693 2560 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 21:16:52.220719 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.214698 2560 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 21:16:52.220719 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.214702 2560 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 21:16:52.220719 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.214706 2560 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 21:16:52.220719 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.214710 2560 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 21:16:52.220719 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.214714 2560 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 21:16:52.220719 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.214720 2560 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 21:16:52.221318 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.214724 2560 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 21:16:52.221318 
ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.214728 2560 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 21:16:52.221318 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.214732 2560 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 21:16:52.221318 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.214737 2560 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 21:16:52.221318 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.214741 2560 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 21:16:52.221318 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.214745 2560 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 21:16:52.221318 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.214749 2560 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 21:16:52.221318 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.214753 2560 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 21:16:52.221318 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.214757 2560 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 21:16:52.221318 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.214761 2560 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 21:16:52.221318 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.214765 2560 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 21:16:52.221318 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.214769 2560 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 21:16:52.221318 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.214773 2560 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 21:16:52.221318 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.214777 2560 feature_gate.go:328] unrecognized 
feature gate: ExternalSnapshotMetadata Apr 24 21:16:52.221318 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.214782 2560 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 21:16:52.221318 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.214786 2560 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 21:16:52.221318 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.214790 2560 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 21:16:52.221318 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.214793 2560 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 21:16:52.221318 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.214810 2560 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 21:16:52.221781 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.214815 2560 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 21:16:52.221781 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.214819 2560 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 21:16:52.221781 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.214823 2560 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 21:16:52.221781 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.214827 2560 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 21:16:52.221781 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.214831 2560 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 21:16:52.221781 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.214836 2560 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 21:16:52.221781 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.214840 2560 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 21:16:52.221781 ip-10-0-128-142 
kubenswrapper[2560]: W0424 21:16:52.214844 2560 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 21:16:52.221781 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.214849 2560 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 21:16:52.221781 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.214854 2560 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 21:16:52.221781 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.214858 2560 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 21:16:52.221781 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.214862 2560 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 21:16:52.221781 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.214868 2560 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 21:16:52.221781 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.214872 2560 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 21:16:52.221781 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.214876 2560 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 21:16:52.221781 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.214881 2560 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 21:16:52.221781 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.214885 2560 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 21:16:52.221781 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.214889 2560 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 21:16:52.221781 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.214893 2560 feature_gate.go:328] unrecognized feature gate: Example Apr 24 21:16:52.221781 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.214897 2560 feature_gate.go:328] unrecognized feature gate: 
GCPClusterHostedDNSInstall Apr 24 21:16:52.222278 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.214901 2560 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 21:16:52.222278 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.214905 2560 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 21:16:52.222278 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.214909 2560 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 21:16:52.222278 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.214913 2560 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 21:16:52.222278 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.214917 2560 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 21:16:52.222278 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.214936 2560 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 21:16:52.222278 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.214941 2560 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 21:16:52.222278 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.214946 2560 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 21:16:52.222278 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.214954 2560 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 21:16:52.222278 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.214958 2560 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 21:16:52.222278 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.214962 2560 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 21:16:52.222278 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.214967 2560 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 21:16:52.222278 ip-10-0-128-142 
kubenswrapper[2560]: W0424 21:16:52.214971 2560 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 21:16:52.222278 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.214975 2560 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 21:16:52.222278 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.214979 2560 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 21:16:52.222278 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.214984 2560 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 24 21:16:52.222278 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.214991 2560 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 21:16:52.222278 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.215000 2560 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 24 21:16:52.222278 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.215006 2560 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 21:16:52.222725 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.215010 2560 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 21:16:52.222725 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.215015 2560 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 21:16:52.222725 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.215020 2560 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 21:16:52.222725 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.215024 2560 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 21:16:52.222725 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.215028 2560 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 21:16:52.222725 ip-10-0-128-142 kubenswrapper[2560]: I0424 
21:16:52.215574 2560 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 24 21:16:52.222886 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.222752 2560 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 24 21:16:52.222886 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.222767 2560 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 24 21:16:52.222886 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.222821 2560 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 21:16:52.222886 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.222826 2560 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 21:16:52.222886 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.222831 2560 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 24 21:16:52.222886 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.222836 2560 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 21:16:52.222886 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.222839 2560 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 21:16:52.222886 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.222842 2560 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 21:16:52.222886 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.222845 2560 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 21:16:52.222886 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.222848 2560 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 21:16:52.222886 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.222850 2560 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 21:16:52.222886 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.222853 2560 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 21:16:52.222886 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.222855 2560 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 21:16:52.222886 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.222858 2560 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 21:16:52.222886 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.222860 2560 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 21:16:52.222886 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.222863 2560 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 21:16:52.222886 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.222866 2560 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 21:16:52.222886 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.222868 2560 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 21:16:52.222886 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.222870 2560 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 21:16:52.223362 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.222873 2560 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 21:16:52.223362 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.222876 2560 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 21:16:52.223362 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.222879 2560 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 21:16:52.223362 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.222881 2560 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 21:16:52.223362 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.222884 2560 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 21:16:52.223362 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.222887 2560 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 21:16:52.223362 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.222889 2560 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 21:16:52.223362 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.222892 2560 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 21:16:52.223362 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.222895 2560 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 21:16:52.223362 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.222897 2560 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 21:16:52.223362 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.222900 2560 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 21:16:52.223362 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.222903 2560 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 21:16:52.223362 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.222905 2560 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 21:16:52.223362 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.222908 2560 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 21:16:52.223362 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.222911 2560 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 21:16:52.223362 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.222914 2560 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 21:16:52.223362 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.222916 2560 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 21:16:52.223362 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.222918 2560 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 21:16:52.223362 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.222935 2560 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 21:16:52.223362 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.222940 2560 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 21:16:52.223842 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.222942 2560 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 21:16:52.223842 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.222945 2560 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 21:16:52.223842 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.222948 2560 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 21:16:52.223842 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.222950 2560 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 21:16:52.223842 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.222953 2560 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 21:16:52.223842 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.222955 2560 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 21:16:52.223842 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.222959 2560 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 21:16:52.223842 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.222962 2560 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 21:16:52.223842 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.222965 2560 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 21:16:52.223842 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.222968 2560 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 21:16:52.223842 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.222970 2560 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 21:16:52.223842 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.222973 2560 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 21:16:52.223842 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.222975 2560 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 21:16:52.223842 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.222979 2560 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 21:16:52.223842 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.222982 2560 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 21:16:52.223842 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.222985 2560 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 21:16:52.223842 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.222988 2560 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 21:16:52.223842 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.222990 2560 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 21:16:52.223842 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.222993 2560 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 21:16:52.223842 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.222995 2560 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 21:16:52.224341 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.222998 2560 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 21:16:52.224341 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.223001 2560 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 21:16:52.224341 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.223003 2560 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 21:16:52.224341 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.223005 2560 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 21:16:52.224341 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.223008 2560 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 21:16:52.224341 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.223010 2560 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 21:16:52.224341 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.223013 2560 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 21:16:52.224341 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.223028 2560 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 21:16:52.224341 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.223030 2560 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 21:16:52.224341 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.223033 2560 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 21:16:52.224341 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.223035 2560 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 21:16:52.224341 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.223038 2560 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 21:16:52.224341 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.223041 2560 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 21:16:52.224341 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.223043 2560 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 21:16:52.224341 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.223046 2560 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 21:16:52.224341 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.223049 2560 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 21:16:52.224341 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.223051 2560 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 21:16:52.224341 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.223053 2560 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 21:16:52.224341 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.223056 2560 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 21:16:52.224790 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.223059 2560 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 21:16:52.224790 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.223061 2560 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 21:16:52.224790 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.223064 2560 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 21:16:52.224790 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.223066 2560 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 21:16:52.224790 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.223069 2560 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 21:16:52.224790 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.223071 2560 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 21:16:52.224790 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.223075 2560 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 21:16:52.224790 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.223077 2560 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 21:16:52.224790 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.223080 2560 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 21:16:52.224790 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.223082 2560 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 21:16:52.224790 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.223088 2560 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 21:16:52.224790 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.223179 2560 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 21:16:52.224790 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.223184 2560 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 21:16:52.224790 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.223187 2560 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 21:16:52.224790 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.223190 2560 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 21:16:52.225167 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.223193 2560 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 21:16:52.225167 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.223196 2560 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 21:16:52.225167 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.223198 2560 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 21:16:52.225167 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.223201 2560 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 21:16:52.225167 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.223203 2560 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 21:16:52.225167 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.223206 2560 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 21:16:52.225167 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.223209 2560 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 21:16:52.225167 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.223211 2560 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 21:16:52.225167 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.223214 2560 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 21:16:52.225167 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.223216 2560 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 21:16:52.225167 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.223219 2560 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 21:16:52.225167 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.223221 2560 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 21:16:52.225167 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.223224 2560 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 21:16:52.225167 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.223226 2560 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 21:16:52.225167 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.223229 2560 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 21:16:52.225167 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.223231 2560 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 21:16:52.225167 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.223234 2560 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 21:16:52.225167 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.223236 2560 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 21:16:52.225167 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.223238 2560 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 21:16:52.225641 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.223241 2560 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 21:16:52.225641 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.223243 2560 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 21:16:52.225641 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.223246 2560 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 21:16:52.225641 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.223248 2560 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 21:16:52.225641 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.223251 2560 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 21:16:52.225641 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.223254 2560 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 21:16:52.225641 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.223257 2560 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 21:16:52.225641 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.223259 2560 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 21:16:52.225641 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.223262 2560 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 21:16:52.225641 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.223266 2560 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 21:16:52.225641 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.223269 2560 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 21:16:52.225641 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.223272 2560 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 21:16:52.225641 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.223275 2560 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 21:16:52.225641 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.223277 2560 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 21:16:52.225641 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.223280 2560 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 21:16:52.225641 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.223282 2560 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 21:16:52.225641 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.223284 2560 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 21:16:52.225641 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.223288 2560 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
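Nearly all of these `feature_gate.go:328]` warnings are OpenShift-specific gates that the upstream kubelet's gate parser does not know; the same names recur because the configuration is parsed in several passes. When scanning a dump like this, collapsing the warnings to the distinct gate names makes the noise manageable. A minimal sketch (a hypothetical helper, not part of kubelet or journalctl tooling):

```python
import re

def unrecognized_gates(journal_text: str) -> list[str]:
    """Return the distinct feature-gate names flagged as unrecognized, sorted."""
    names = re.findall(r"unrecognized feature gate: (\w+)", journal_text)
    return sorted(set(names))

# Tiny sample in the same shape as the journal entries above.
sample = (
    "W0424 21:16:52.222845 2560 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities\n"
    "W0424 21:16:52.222895 2560 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy\n"
    "W0424 21:16:52.223284 2560 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy\n"
)
print(unrecognized_gates(sample))  # ['AdditionalRoutingCapabilities', 'AdminNetworkPolicy']
```

Piping `journalctl -u kubelet` output through a helper like this turns hundreds of repeated warnings into one short list for review.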
Apr 24 21:16:52.225641 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.223292 2560 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 21:16:52.226138 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.223294 2560 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 21:16:52.226138 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.223297 2560 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 21:16:52.226138 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.223299 2560 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 21:16:52.226138 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.223302 2560 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 21:16:52.226138 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.223304 2560 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 21:16:52.226138 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.223307 2560 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 21:16:52.226138 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.223310 2560 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 21:16:52.226138 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.223312 2560 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 21:16:52.226138 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.223315 2560 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 21:16:52.226138 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.223317 2560 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 21:16:52.226138 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.223319 2560 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 21:16:52.226138 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.223322 2560 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 21:16:52.226138 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.223324 2560 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 21:16:52.226138 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.223327 2560 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 21:16:52.226138 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.223329 2560 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 21:16:52.226138 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.223332 2560 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 21:16:52.226138 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.223334 2560 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 21:16:52.226138 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.223337 2560 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 21:16:52.226138 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.223340 2560 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 21:16:52.226138 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.223343 2560 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 21:16:52.226596 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.223345 2560 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 21:16:52.226596 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.223348 2560 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 21:16:52.226596 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.223350 2560 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 21:16:52.226596 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.223352 2560 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 21:16:52.226596 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.223355 2560 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 21:16:52.226596 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.223357 2560 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 21:16:52.226596 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.223359 2560 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 21:16:52.226596 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.223362 2560 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 21:16:52.226596 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.223365 2560 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 21:16:52.226596 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.223367 2560 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 21:16:52.226596 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.223370 2560 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 21:16:52.226596 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.223373 2560 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 21:16:52.226596 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.223375 2560 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 21:16:52.226596 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.223378 2560 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 21:16:52.226596 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.223380 2560 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 21:16:52.226596 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.223383 2560 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 21:16:52.226596 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.223385 2560 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 21:16:52.226596 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.223387 2560 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 21:16:52.226596 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.223390 2560 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 21:16:52.226596 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.223413 2560 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 21:16:52.226596 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.223417 2560 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 21:16:52.227115 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.223420 2560 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 21:16:52.227115 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.223423 2560 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 21:16:52.227115 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:52.223426 2560 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 21:16:52.227115 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.223430 2560 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 21:16:52.227115 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.224032 2560 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 24 21:16:52.227115 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.225842 2560 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 24 21:16:52.227115 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.226709 2560 server.go:1019] "Starting client certificate rotation"
Apr 24 21:16:52.227115 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.226806 2560 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 24 21:16:52.227115 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.226846 2560 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 24 21:16:52.247019 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.246997 2560 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 24 21:16:52.251621 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.251602 2560 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 24 21:16:52.265279 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.265255 2560 log.go:25] "Validated CRI v1 runtime API"
Apr 24 21:16:52.270543 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.270524 2560 log.go:25] "Validated CRI v1 image API"
Apr 24 21:16:52.271706 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.271690 2560 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 24 21:16:52.274984 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.274966 2560 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 24 21:16:52.275464 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.275442 2560 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 e2e3e42d-9093-4a8b-8d80-bbec26874354:/dev/nvme0n1p3 f1883dd1-be98-4d11-b7cc-6ad85cbfbfb3:/dev/nvme0n1p4]
Apr 24 21:16:52.275550 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.275463 2560 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 24 21:16:52.280758 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.280647 2560 manager.go:217] Machine: {Timestamp:2026-04-24 21:16:52.279060679 +0000 UTC m=+0.337848341 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3199930 MemoryCapacity:33164484608 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2acd650134cee5f875fdee96656795 SystemUUID:ec2acd65-0134-cee5-f875-fdee96656795 BootID:f32c4933-7235-4e93-8507-7452104a2676 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582242304 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:34:48:a1:00:1d Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:34:48:a1:00:1d Speed:0 Mtu:9001} {Name:ovs-system MacAddress:ba:c2:4a:6c:c6:83 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164484608 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 24 21:16:52.280758 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.280746 2560 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 24 21:16:52.280903 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.280833 2560 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 24 21:16:52.281242 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.281212 2560 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 24 21:16:52.281407 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.281242 2560 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-128-142.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":
"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 24 21:16:52.281482 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.281422 2560 topology_manager.go:138] "Creating topology manager with none policy" Apr 24 21:16:52.281482 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.281437 2560 container_manager_linux.go:306] "Creating device plugin manager" Apr 24 21:16:52.281482 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.281460 2560 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 24 21:16:52.282301 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.282289 2560 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 24 21:16:52.283734 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.283721 2560 state_mem.go:36] "Initialized new in-memory state store" Apr 24 21:16:52.283859 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.283849 2560 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 24 21:16:52.286380 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.286368 2560 kubelet.go:491] "Attempting to sync node with API server" Apr 24 21:16:52.286440 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.286392 2560 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 24 21:16:52.286440 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.286413 2560 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 24 21:16:52.286440 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.286426 2560 kubelet.go:397] "Adding apiserver pod source" Apr 24 21:16:52.286440 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.286437 2560 apiserver.go:42] "Waiting for node sync 
before watching apiserver pods" Apr 24 21:16:52.287430 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.287417 2560 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 24 21:16:52.287502 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.287439 2560 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 24 21:16:52.289937 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.289909 2560 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 24 21:16:52.292044 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.292027 2560 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 24 21:16:52.293576 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.293562 2560 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 24 21:16:52.293660 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.293584 2560 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 24 21:16:52.293660 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.293607 2560 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 24 21:16:52.293660 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.293618 2560 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 24 21:16:52.293660 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.293627 2560 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 24 21:16:52.293823 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.293637 2560 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 24 21:16:52.293872 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.293834 2560 plugins.go:616] "Loaded volume plugin" 
pluginName="kubernetes.io/iscsi" Apr 24 21:16:52.293872 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.293844 2560 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 24 21:16:52.293872 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.293852 2560 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 24 21:16:52.293872 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.293858 2560 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 24 21:16:52.294001 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.293882 2560 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 24 21:16:52.294001 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.293890 2560 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 24 21:16:52.295294 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.295282 2560 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 24 21:16:52.295294 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.295292 2560 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 24 21:16:52.298498 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.298484 2560 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 24 21:16:52.298593 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.298518 2560 server.go:1295] "Started kubelet" Apr 24 21:16:52.298675 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.298640 2560 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 24 21:16:52.298720 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.298656 2560 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 24 21:16:52.298720 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.298690 2560 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 24 21:16:52.299328 ip-10-0-128-142 systemd[1]: Started Kubernetes Kubelet. 
Apr 24 21:16:52.299683 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.299661 2560 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 24 21:16:52.300741 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.300728 2560 server.go:317] "Adding debug handlers to kubelet server" Apr 24 21:16:52.302768 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.302743 2560 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-128-142.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 24 21:16:52.303493 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:16:52.303454 2560 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-128-142.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 24 21:16:52.303561 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:16:52.303454 2560 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 24 21:16:52.305896 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.305882 2560 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 24 21:16:52.306199 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:16:52.305405 2560 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" 
event="&Event{ObjectMeta:{ip-10-0-128-142.ec2.internal.18a96793c2af5826 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-128-142.ec2.internal,UID:ip-10-0-128-142.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-128-142.ec2.internal,},FirstTimestamp:2026-04-24 21:16:52.298496038 +0000 UTC m=+0.357283699,LastTimestamp:2026-04-24 21:16:52.298496038 +0000 UTC m=+0.357283699,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-128-142.ec2.internal,}" Apr 24 21:16:52.306348 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.306331 2560 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 24 21:16:52.306955 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.306844 2560 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 24 21:16:52.307215 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.307197 2560 factory.go:55] Registering systemd factory Apr 24 21:16:52.307298 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.307224 2560 factory.go:223] Registration of the systemd container factory successfully Apr 24 21:16:52.307298 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:16:52.307224 2560 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-142.ec2.internal\" not found" Apr 24 21:16:52.307577 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.307547 2560 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 24 21:16:52.307577 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.307569 2560 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 24 21:16:52.307710 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.307592 2560 factory.go:153] Registering CRI-O factory Apr 24 21:16:52.307710 
ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.307604 2560 factory.go:223] Registration of the crio container factory successfully Apr 24 21:16:52.307710 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.307655 2560 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 24 21:16:52.307710 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.307677 2560 factory.go:103] Registering Raw factory Apr 24 21:16:52.307710 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.307692 2560 manager.go:1196] Started watching for new ooms in manager Apr 24 21:16:52.307710 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.307697 2560 reconstruct.go:97] "Volume reconstruction finished" Apr 24 21:16:52.307710 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.307706 2560 reconciler.go:26] "Reconciler: start to sync state" Apr 24 21:16:52.308200 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.308182 2560 manager.go:319] Starting recovery of all containers Apr 24 21:16:52.311126 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:16:52.310958 2560 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 24 21:16:52.312260 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.312109 2560 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-zpk2d" Apr 24 21:16:52.312411 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:16:52.312372 2560 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-128-142.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 24 21:16:52.312672 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:16:52.312635 2560 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 24 21:16:52.320411 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.320397 2560 manager.go:324] Recovery completed Apr 24 21:16:52.321699 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:16:52.321683 2560 watcher.go:152] Failed to watch directory "/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service": inotify_add_watch /sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service: no such file or directory Apr 24 21:16:52.324472 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.324461 2560 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:16:52.326701 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.326685 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-142.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:16:52.326757 
ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.326711 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-142.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:16:52.326757 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.326721 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-142.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:16:52.327189 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.327174 2560 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 24 21:16:52.327189 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.327188 2560 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 24 21:16:52.327261 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.327204 2560 state_mem.go:36] "Initialized new in-memory state store" Apr 24 21:16:52.328640 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:16:52.328581 2560 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-128-142.ec2.internal.18a96793c45db279 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-128-142.ec2.internal,UID:ip-10-0-128-142.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-128-142.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-128-142.ec2.internal,},FirstTimestamp:2026-04-24 21:16:52.326699641 +0000 UTC m=+0.385487304,LastTimestamp:2026-04-24 21:16:52.326699641 +0000 UTC m=+0.385487304,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-128-142.ec2.internal,}" Apr 24 21:16:52.329269 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.329258 2560 
policy_none.go:49] "None policy: Start" Apr 24 21:16:52.329327 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.329273 2560 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 24 21:16:52.329327 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.329283 2560 state_mem.go:35] "Initializing new in-memory state store" Apr 24 21:16:52.337372 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:16:52.337315 2560 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-128-142.ec2.internal.18a96793c45df321 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-128-142.ec2.internal,UID:ip-10-0-128-142.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node ip-10-0-128-142.ec2.internal status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-128-142.ec2.internal,},FirstTimestamp:2026-04-24 21:16:52.326716193 +0000 UTC m=+0.385503855,LastTimestamp:2026-04-24 21:16:52.326716193 +0000 UTC m=+0.385503855,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-128-142.ec2.internal,}" Apr 24 21:16:52.347189 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:16:52.347111 2560 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-128-142.ec2.internal.18a96793c45e151b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-128-142.ec2.internal,UID:ip-10-0-128-142.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node ip-10-0-128-142.ec2.internal status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:ip-10-0-128-142.ec2.internal,},FirstTimestamp:2026-04-24 21:16:52.326724891 +0000 UTC m=+0.385512553,LastTimestamp:2026-04-24 21:16:52.326724891 +0000 UTC m=+0.385512553,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-128-142.ec2.internal,}" Apr 24 21:16:52.378498 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.365057 2560 manager.go:341] "Starting Device Plugin manager" Apr 24 21:16:52.378498 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:16:52.365079 2560 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 24 21:16:52.378498 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.365087 2560 server.go:85] "Starting device plugin registration server" Apr 24 21:16:52.378498 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.365257 2560 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 24 21:16:52.378498 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.365267 2560 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 24 21:16:52.378498 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.365373 2560 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 24 21:16:52.378498 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.365465 2560 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 24 21:16:52.378498 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.365474 2560 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 24 
21:16:52.378498 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:16:52.366069 2560 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 24 21:16:52.378498 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:16:52.366097 2560 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-128-142.ec2.internal\" not found" Apr 24 21:16:52.395583 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:16:52.395525 2560 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-128-142.ec2.internal.18a96793c7ea4ae9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-128-142.ec2.internal,UID:ip-10-0-128-142.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:ip-10-0-128-142.ec2.internal,},FirstTimestamp:2026-04-24 21:16:52.386245353 +0000 UTC m=+0.445033006,LastTimestamp:2026-04-24 21:16:52.386245353 +0000 UTC m=+0.445033006,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-128-142.ec2.internal,}" Apr 24 21:16:52.439286 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.439252 2560 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 24 21:16:52.440428 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.440415 2560 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv6" Apr 24 21:16:52.440504 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.440439 2560 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 24 21:16:52.440504 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.440456 2560 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 24 21:16:52.440504 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.440465 2560 kubelet.go:2451] "Starting kubelet main sync loop" Apr 24 21:16:52.440504 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:16:52.440499 2560 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 24 21:16:52.451091 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:16:52.451069 2560 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Apr 24 21:16:52.466350 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.466304 2560 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:16:52.467100 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.467080 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-142.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:16:52.467180 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.467104 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-142.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:16:52.467180 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.467118 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-142.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:16:52.467180 
ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.467143 2560 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-128-142.ec2.internal" Apr 24 21:16:52.475704 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:16:52.475648 2560 event.go:359] "Server rejected event (will not retry!)" err="events \"ip-10-0-128-142.ec2.internal.18a96793c45db279\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-128-142.ec2.internal.18a96793c45db279 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-128-142.ec2.internal,UID:ip-10-0-128-142.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-128-142.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-128-142.ec2.internal,},FirstTimestamp:2026-04-24 21:16:52.326699641 +0000 UTC m=+0.385487304,LastTimestamp:2026-04-24 21:16:52.467094583 +0000 UTC m=+0.525882248,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-128-142.ec2.internal,}" Apr 24 21:16:52.477942 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:16:52.477870 2560 event.go:359] "Server rejected event (will not retry!)" err="events \"ip-10-0-128-142.ec2.internal.18a96793c45df321\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-128-142.ec2.internal.18a96793c45df321 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-128-142.ec2.internal,UID:ip-10-0-128-142.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node ip-10-0-128-142.ec2.internal status is now: 
NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-128-142.ec2.internal,},FirstTimestamp:2026-04-24 21:16:52.326716193 +0000 UTC m=+0.385503855,LastTimestamp:2026-04-24 21:16:52.467110987 +0000 UTC m=+0.525898650,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-128-142.ec2.internal,}" Apr 24 21:16:52.478022 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:16:52.477957 2560 kubelet_node_status.go:116] "Unable to register node with API server, error getting existing node" err="nodes \"ip-10-0-128-142.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"nodes\" in API group \"\" at the cluster scope" node="ip-10-0-128-142.ec2.internal" Apr 24 21:16:52.483384 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:16:52.483333 2560 event.go:359] "Server rejected event (will not retry!)" err="events \"ip-10-0-128-142.ec2.internal.18a96793c45e151b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-128-142.ec2.internal.18a96793c45e151b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-128-142.ec2.internal,UID:ip-10-0-128-142.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node ip-10-0-128-142.ec2.internal status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:ip-10-0-128-142.ec2.internal,},FirstTimestamp:2026-04-24 21:16:52.326724891 +0000 UTC m=+0.385512553,LastTimestamp:2026-04-24 21:16:52.467124045 +0000 UTC m=+0.525911710,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-128-142.ec2.internal,}" Apr 24 21:16:52.515321 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:16:52.515303 2560 controller.go:145] 
"Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-128-142.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="400ms" Apr 24 21:16:52.541481 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.541456 2560 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-142.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-128-142.ec2.internal"] Apr 24 21:16:52.541538 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.541528 2560 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:16:52.542193 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.542180 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-142.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:16:52.542248 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.542208 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-142.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:16:52.542248 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.542219 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-142.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:16:52.544485 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.544473 2560 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:16:52.544609 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.544596 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-142.ec2.internal" Apr 24 21:16:52.544656 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.544624 2560 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:16:52.545125 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.545099 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-142.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:16:52.545189 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.545127 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-142.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:16:52.545189 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.545140 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-142.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:16:52.545189 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.545099 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-142.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:16:52.545286 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.545202 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-142.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:16:52.545286 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.545215 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-142.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:16:52.547321 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.547307 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-142.ec2.internal" Apr 24 21:16:52.547374 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.547331 2560 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:16:52.548116 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.548102 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-142.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:16:52.548205 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.548131 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-142.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:16:52.548205 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.548146 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-142.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:16:52.551705 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:16:52.551651 2560 event.go:359] "Server rejected event (will not retry!)" err="events \"ip-10-0-128-142.ec2.internal.18a96793c45db279\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-128-142.ec2.internal.18a96793c45db279 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-128-142.ec2.internal,UID:ip-10-0-128-142.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-128-142.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-128-142.ec2.internal,},FirstTimestamp:2026-04-24 21:16:52.326699641 +0000 UTC m=+0.385487304,LastTimestamp:2026-04-24 21:16:52.542196759 +0000 UTC m=+0.600984432,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-128-142.ec2.internal,}" Apr 24 21:16:52.560862 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:16:52.560810 2560 event.go:359] "Server rejected event (will not retry!)" err="events \"ip-10-0-128-142.ec2.internal.18a96793c45df321\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-128-142.ec2.internal.18a96793c45df321 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-128-142.ec2.internal,UID:ip-10-0-128-142.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node ip-10-0-128-142.ec2.internal status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-128-142.ec2.internal,},FirstTimestamp:2026-04-24 21:16:52.326716193 +0000 UTC m=+0.385503855,LastTimestamp:2026-04-24 21:16:52.542212426 +0000 UTC m=+0.601000088,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-128-142.ec2.internal,}" Apr 24 21:16:52.564438 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:16:52.564420 2560 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-128-142.ec2.internal\" not found" node="ip-10-0-128-142.ec2.internal" Apr 24 21:16:52.568288 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:16:52.568275 2560 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-128-142.ec2.internal\" not found" node="ip-10-0-128-142.ec2.internal" Apr 24 21:16:52.569392 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:16:52.569341 2560 event.go:359] "Server rejected event (will not retry!)" err="events \"ip-10-0-128-142.ec2.internal.18a96793c45e151b\" is forbidden: User 
\"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-128-142.ec2.internal.18a96793c45e151b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-128-142.ec2.internal,UID:ip-10-0-128-142.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node ip-10-0-128-142.ec2.internal status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:ip-10-0-128-142.ec2.internal,},FirstTimestamp:2026-04-24 21:16:52.326724891 +0000 UTC m=+0.385512553,LastTimestamp:2026-04-24 21:16:52.542224592 +0000 UTC m=+0.601012255,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-128-142.ec2.internal,}" Apr 24 21:16:52.578207 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:16:52.578155 2560 event.go:359] "Server rejected event (will not retry!)" err="events \"ip-10-0-128-142.ec2.internal.18a96793c45db279\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-128-142.ec2.internal.18a96793c45db279 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-128-142.ec2.internal,UID:ip-10-0-128-142.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-128-142.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-128-142.ec2.internal,},FirstTimestamp:2026-04-24 21:16:52.326699641 +0000 UTC m=+0.385487304,LastTimestamp:2026-04-24 21:16:52.545117832 +0000 UTC m=+0.603905498,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-128-142.ec2.internal,}" 
Apr 24 21:16:52.586869 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:16:52.586818 2560 event.go:359] "Server rejected event (will not retry!)" err="events \"ip-10-0-128-142.ec2.internal.18a96793c45df321\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-128-142.ec2.internal.18a96793c45df321 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-128-142.ec2.internal,UID:ip-10-0-128-142.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node ip-10-0-128-142.ec2.internal status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-128-142.ec2.internal,},FirstTimestamp:2026-04-24 21:16:52.326716193 +0000 UTC m=+0.385503855,LastTimestamp:2026-04-24 21:16:52.545133064 +0000 UTC m=+0.603920729,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-128-142.ec2.internal,}" Apr 24 21:16:52.595911 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:16:52.595847 2560 event.go:359] "Server rejected event (will not retry!)" err="events \"ip-10-0-128-142.ec2.internal.18a96793c45e151b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-128-142.ec2.internal.18a96793c45e151b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-128-142.ec2.internal,UID:ip-10-0-128-142.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node ip-10-0-128-142.ec2.internal status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:ip-10-0-128-142.ec2.internal,},FirstTimestamp:2026-04-24 21:16:52.326724891 +0000 UTC 
m=+0.385512553,LastTimestamp:2026-04-24 21:16:52.545145391 +0000 UTC m=+0.603933053,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-128-142.ec2.internal,}" Apr 24 21:16:52.603600 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:16:52.603547 2560 event.go:359] "Server rejected event (will not retry!)" err="events \"ip-10-0-128-142.ec2.internal.18a96793c45db279\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-128-142.ec2.internal.18a96793c45db279 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-128-142.ec2.internal,UID:ip-10-0-128-142.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-128-142.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-128-142.ec2.internal,},FirstTimestamp:2026-04-24 21:16:52.326699641 +0000 UTC m=+0.385487304,LastTimestamp:2026-04-24 21:16:52.545192207 +0000 UTC m=+0.603979873,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-128-142.ec2.internal,}" Apr 24 21:16:52.610036 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.610017 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/784ad9f5e87f1e75291c25fc06106e5e-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-142.ec2.internal\" (UID: \"784ad9f5e87f1e75291c25fc06106e5e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-142.ec2.internal" Apr 24 21:16:52.610092 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.610051 2560 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/784ad9f5e87f1e75291c25fc06106e5e-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-142.ec2.internal\" (UID: \"784ad9f5e87f1e75291c25fc06106e5e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-142.ec2.internal" Apr 24 21:16:52.610092 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.610073 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/97ad1a8b57a0086b081c576d14dd05e7-config\") pod \"kube-apiserver-proxy-ip-10-0-128-142.ec2.internal\" (UID: \"97ad1a8b57a0086b081c576d14dd05e7\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-142.ec2.internal" Apr 24 21:16:52.612369 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:16:52.612314 2560 event.go:359] "Server rejected event (will not retry!)" err="events \"ip-10-0-128-142.ec2.internal.18a96793c45df321\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-128-142.ec2.internal.18a96793c45df321 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-128-142.ec2.internal,UID:ip-10-0-128-142.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node ip-10-0-128-142.ec2.internal status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-128-142.ec2.internal,},FirstTimestamp:2026-04-24 21:16:52.326716193 +0000 UTC m=+0.385503855,LastTimestamp:2026-04-24 21:16:52.545208484 +0000 UTC m=+0.603996146,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-128-142.ec2.internal,}" Apr 24 21:16:52.621682 ip-10-0-128-142 kubenswrapper[2560]: E0424 
21:16:52.621633 2560 event.go:359] "Server rejected event (will not retry!)" err="events \"ip-10-0-128-142.ec2.internal.18a96793c45e151b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-128-142.ec2.internal.18a96793c45e151b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-128-142.ec2.internal,UID:ip-10-0-128-142.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node ip-10-0-128-142.ec2.internal status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:ip-10-0-128-142.ec2.internal,},FirstTimestamp:2026-04-24 21:16:52.326724891 +0000 UTC m=+0.385512553,LastTimestamp:2026-04-24 21:16:52.545220634 +0000 UTC m=+0.604008300,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-128-142.ec2.internal,}" Apr 24 21:16:52.635633 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:16:52.635566 2560 event.go:359] "Server rejected event (will not retry!)" err="events \"ip-10-0-128-142.ec2.internal.18a96793c45db279\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-128-142.ec2.internal.18a96793c45db279 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-128-142.ec2.internal,UID:ip-10-0-128-142.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-128-142.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-128-142.ec2.internal,},FirstTimestamp:2026-04-24 21:16:52.326699641 +0000 UTC m=+0.385487304,LastTimestamp:2026-04-24 21:16:52.54811534 +0000 UTC 
m=+0.606903003,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-128-142.ec2.internal,}" Apr 24 21:16:52.648533 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:16:52.648480 2560 event.go:359] "Server rejected event (will not retry!)" err="events \"ip-10-0-128-142.ec2.internal.18a96793c45df321\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-128-142.ec2.internal.18a96793c45df321 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-128-142.ec2.internal,UID:ip-10-0-128-142.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node ip-10-0-128-142.ec2.internal status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-128-142.ec2.internal,},FirstTimestamp:2026-04-24 21:16:52.326716193 +0000 UTC m=+0.385503855,LastTimestamp:2026-04-24 21:16:52.548138862 +0000 UTC m=+0.606926530,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-128-142.ec2.internal,}" Apr 24 21:16:52.657619 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:16:52.657555 2560 event.go:359] "Server rejected event (will not retry!)" err="events \"ip-10-0-128-142.ec2.internal.18a96793c45e151b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-128-142.ec2.internal.18a96793c45e151b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-128-142.ec2.internal,UID:ip-10-0-128-142.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node ip-10-0-128-142.ec2.internal status is now: 
NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:ip-10-0-128-142.ec2.internal,},FirstTimestamp:2026-04-24 21:16:52.326724891 +0000 UTC m=+0.385512553,LastTimestamp:2026-04-24 21:16:52.548151913 +0000 UTC m=+0.606939576,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-128-142.ec2.internal,}" Apr 24 21:16:52.678686 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.678665 2560 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:16:52.679437 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.679407 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-142.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:16:52.679514 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.679443 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-142.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:16:52.679514 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.679453 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-142.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:16:52.679514 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.679471 2560 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-128-142.ec2.internal" Apr 24 21:16:52.691311 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:16:52.691252 2560 event.go:359] "Server rejected event (will not retry!)" err="events \"ip-10-0-128-142.ec2.internal.18a96793c45db279\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-128-142.ec2.internal.18a96793c45db279 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-128-142.ec2.internal,UID:ip-10-0-128-142.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-128-142.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-128-142.ec2.internal,},FirstTimestamp:2026-04-24 21:16:52.326699641 +0000 UTC m=+0.385487304,LastTimestamp:2026-04-24 21:16:52.679432197 +0000 UTC m=+0.738219859,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-128-142.ec2.internal,}" Apr 24 21:16:52.701269 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:16:52.701201 2560 event.go:359] "Server rejected event (will not retry!)" err="events \"ip-10-0-128-142.ec2.internal.18a96793c45df321\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-128-142.ec2.internal.18a96793c45df321 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-128-142.ec2.internal,UID:ip-10-0-128-142.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node ip-10-0-128-142.ec2.internal status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-128-142.ec2.internal,},FirstTimestamp:2026-04-24 21:16:52.326716193 +0000 UTC m=+0.385503855,LastTimestamp:2026-04-24 21:16:52.679448118 +0000 UTC m=+0.738235780,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-128-142.ec2.internal,}" Apr 24 21:16:52.701539 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:16:52.701520 2560 kubelet_node_status.go:116] "Unable to register node with API server, error getting existing node" err="nodes \"ip-10-0-128-142.ec2.internal\" 
is forbidden: User \"system:anonymous\" cannot get resource \"nodes\" in API group \"\" at the cluster scope" node="ip-10-0-128-142.ec2.internal" Apr 24 21:16:52.710234 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.710213 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/784ad9f5e87f1e75291c25fc06106e5e-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-142.ec2.internal\" (UID: \"784ad9f5e87f1e75291c25fc06106e5e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-142.ec2.internal" Apr 24 21:16:52.710316 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.710242 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/97ad1a8b57a0086b081c576d14dd05e7-config\") pod \"kube-apiserver-proxy-ip-10-0-128-142.ec2.internal\" (UID: \"97ad1a8b57a0086b081c576d14dd05e7\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-142.ec2.internal" Apr 24 21:16:52.710316 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.710263 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/784ad9f5e87f1e75291c25fc06106e5e-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-142.ec2.internal\" (UID: \"784ad9f5e87f1e75291c25fc06106e5e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-142.ec2.internal" Apr 24 21:16:52.710390 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.710317 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/784ad9f5e87f1e75291c25fc06106e5e-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-142.ec2.internal\" (UID: \"784ad9f5e87f1e75291c25fc06106e5e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-142.ec2.internal" Apr 24 21:16:52.710390 ip-10-0-128-142 
kubenswrapper[2560]: I0424 21:16:52.710318 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/784ad9f5e87f1e75291c25fc06106e5e-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-142.ec2.internal\" (UID: \"784ad9f5e87f1e75291c25fc06106e5e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-142.ec2.internal" Apr 24 21:16:52.710390 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.710318 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/97ad1a8b57a0086b081c576d14dd05e7-config\") pod \"kube-apiserver-proxy-ip-10-0-128-142.ec2.internal\" (UID: \"97ad1a8b57a0086b081c576d14dd05e7\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-142.ec2.internal" Apr 24 21:16:52.710904 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:16:52.710853 2560 event.go:359] "Server rejected event (will not retry!)" err="events \"ip-10-0-128-142.ec2.internal.18a96793c45e151b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-128-142.ec2.internal.18a96793c45e151b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-128-142.ec2.internal,UID:ip-10-0-128-142.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node ip-10-0-128-142.ec2.internal status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:ip-10-0-128-142.ec2.internal,},FirstTimestamp:2026-04-24 21:16:52.326724891 +0000 UTC m=+0.385512553,LastTimestamp:2026-04-24 21:16:52.67945643 +0000 UTC m=+0.738244094,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-128-142.ec2.internal,}" Apr 24 21:16:52.867628 ip-10-0-128-142 kubenswrapper[2560]: I0424 
21:16:52.867587 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-142.ec2.internal" Apr 24 21:16:52.869171 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:52.869157 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-142.ec2.internal" Apr 24 21:16:52.925328 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:16:52.925306 2560 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-128-142.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="800ms" Apr 24 21:16:53.101861 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:53.101833 2560 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:16:53.102661 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:53.102647 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-142.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:16:53.102723 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:53.102673 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-142.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:16:53.102723 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:53.102684 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-142.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:16:53.102723 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:53.102707 2560 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-128-142.ec2.internal" Apr 24 21:16:53.112483 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:16:53.112417 2560 event.go:359] "Server rejected event (will not retry!)" err="events 
\"ip-10-0-128-142.ec2.internal.18a96793c45db279\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-128-142.ec2.internal.18a96793c45db279 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-128-142.ec2.internal,UID:ip-10-0-128-142.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-128-142.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-128-142.ec2.internal,},FirstTimestamp:2026-04-24 21:16:52.326699641 +0000 UTC m=+0.385487304,LastTimestamp:2026-04-24 21:16:53.102660834 +0000 UTC m=+1.161448495,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-128-142.ec2.internal,}" Apr 24 21:16:53.120596 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:16:53.120549 2560 kubelet_node_status.go:116] "Unable to register node with API server, error getting existing node" err="nodes \"ip-10-0-128-142.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"nodes\" in API group \"\" at the cluster scope" node="ip-10-0-128-142.ec2.internal" Apr 24 21:16:53.120662 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:16:53.120612 2560 event.go:359] "Server rejected event (will not retry!)" err="events \"ip-10-0-128-142.ec2.internal.18a96793c45df321\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-128-142.ec2.internal.18a96793c45df321 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-128-142.ec2.internal,UID:ip-10-0-128-142.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node 
ip-10-0-128-142.ec2.internal status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-128-142.ec2.internal,},FirstTimestamp:2026-04-24 21:16:52.326716193 +0000 UTC m=+0.385503855,LastTimestamp:2026-04-24 21:16:53.102677815 +0000 UTC m=+1.161465477,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-128-142.ec2.internal,}" Apr 24 21:16:53.305599 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:53.305365 2560 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-128-142.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 24 21:16:53.367143 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:16:53.367116 2560 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-128-142.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 24 21:16:53.385357 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:53.385328 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97ad1a8b57a0086b081c576d14dd05e7.slice/crio-5d157f7c8585b959fc0b90b912b8c6d07bdcc256ef447a218e28aa560da6f380 WatchSource:0}: Error finding container 5d157f7c8585b959fc0b90b912b8c6d07bdcc256ef447a218e28aa560da6f380: Status 404 returned error can't find the container with id 5d157f7c8585b959fc0b90b912b8c6d07bdcc256ef447a218e28aa560da6f380 Apr 24 21:16:53.385589 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:16:53.385565 2560 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod784ad9f5e87f1e75291c25fc06106e5e.slice/crio-3c5a72aabddba0e0afc6e6e0279ad471fe809b7a6d77f0b0fc8dbe30194bac80 WatchSource:0}: Error finding container 3c5a72aabddba0e0afc6e6e0279ad471fe809b7a6d77f0b0fc8dbe30194bac80: Status 404 returned error can't find the container with id 3c5a72aabddba0e0afc6e6e0279ad471fe809b7a6d77f0b0fc8dbe30194bac80 Apr 24 21:16:53.389100 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:53.389086 2560 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 21:16:53.400444 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:16:53.400370 2560 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{kube-apiserver-proxy-ip-10-0-128-142.ec2.internal.18a9679403b3fe98 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:kube-apiserver-proxy-ip-10-0-128-142.ec2.internal,UID:97ad1a8b57a0086b081c576d14dd05e7,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{haproxy},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:294af5c64228434d1ed6ee8ea3ac802e3c999aa847223e3b2efa18425a9fe421\",Source:EventSource{Component:kubelet,Host:ip-10-0-128-142.ec2.internal,},FirstTimestamp:2026-04-24 21:16:53.389319832 +0000 UTC m=+1.448107481,LastTimestamp:2026-04-24 21:16:53.389319832 +0000 UTC m=+1.448107481,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-128-142.ec2.internal,}" Apr 24 21:16:53.408700 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:16:53.408641 2560 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot 
create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-ip-10-0-128-142.ec2.internal.18a9679403b4d81d openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-ip-10-0-128-142.ec2.internal,UID:784ad9f5e87f1e75291c25fc06106e5e,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8e1f4bf2daa69c9e69766cebcfced43fa0ee926d4479becdc8a5b05b93ca6e81\",Source:EventSource{Component:kubelet,Host:ip-10-0-128-142.ec2.internal,},FirstTimestamp:2026-04-24 21:16:53.389375517 +0000 UTC m=+1.448163167,LastTimestamp:2026-04-24 21:16:53.389375517 +0000 UTC m=+1.448163167,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-128-142.ec2.internal,}" Apr 24 21:16:53.443543 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:53.443503 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-142.ec2.internal" event={"ID":"784ad9f5e87f1e75291c25fc06106e5e","Type":"ContainerStarted","Data":"3c5a72aabddba0e0afc6e6e0279ad471fe809b7a6d77f0b0fc8dbe30194bac80"} Apr 24 21:16:53.444398 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:53.444371 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-142.ec2.internal" event={"ID":"97ad1a8b57a0086b081c576d14dd05e7","Type":"ContainerStarted","Data":"5d157f7c8585b959fc0b90b912b8c6d07bdcc256ef447a218e28aa560da6f380"} Apr 24 21:16:53.693576 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:16:53.693526 2560 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User 
\"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 24 21:16:53.709059 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:16:53.709036 2560 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Apr 24 21:16:53.720969 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:16:53.720948 2560 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 24 21:16:53.735161 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:16:53.735130 2560 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-128-142.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="1.6s" Apr 24 21:16:53.921480 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:53.921442 2560 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:16:53.922737 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:53.922713 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-142.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:16:53.922851 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:53.922752 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-142.ec2.internal" 
event="NodeHasNoDiskPressure" Apr 24 21:16:53.922851 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:53.922768 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-142.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:16:53.922851 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:53.922810 2560 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-128-142.ec2.internal" Apr 24 21:16:53.947073 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:16:53.947000 2560 kubelet_node_status.go:116] "Unable to register node with API server, error getting existing node" err="nodes \"ip-10-0-128-142.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"nodes\" in API group \"\" at the cluster scope" node="ip-10-0-128-142.ec2.internal" Apr 24 21:16:54.314416 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:54.314386 2560 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-128-142.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 24 21:16:54.905866 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:16:54.905769 2560 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{kube-apiserver-proxy-ip-10-0-128-142.ec2.internal.18a967945d76f72e kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:kube-apiserver-proxy-ip-10-0-128-142.ec2.internal,UID:97ad1a8b57a0086b081c576d14dd05e7,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{haproxy},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:294af5c64228434d1ed6ee8ea3ac802e3c999aa847223e3b2efa18425a9fe421\" in 1.505s (1.505s 
including waiting). Image size: 488332864 bytes.,Source:EventSource{Component:kubelet,Host:ip-10-0-128-142.ec2.internal,},FirstTimestamp:2026-04-24 21:16:54.895269678 +0000 UTC m=+2.954057339,LastTimestamp:2026-04-24 21:16:54.895269678 +0000 UTC m=+2.954057339,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-128-142.ec2.internal,}" Apr 24 21:16:54.914545 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:16:54.914457 2560 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-ip-10-0-128-142.ec2.internal.18a967945d92ea5b openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-ip-10-0-128-142.ec2.internal,UID:784ad9f5e87f1e75291c25fc06106e5e,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8e1f4bf2daa69c9e69766cebcfced43fa0ee926d4479becdc8a5b05b93ca6e81\" in 1.507s (1.507s including waiting). 
Image size: 468435751 bytes.,Source:EventSource{Component:kubelet,Host:ip-10-0-128-142.ec2.internal,},FirstTimestamp:2026-04-24 21:16:54.897101403 +0000 UTC m=+2.955889066,LastTimestamp:2026-04-24 21:16:54.897101403 +0000 UTC m=+2.955889066,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-128-142.ec2.internal,}" Apr 24 21:16:54.960040 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:16:54.959849 2560 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{kube-apiserver-proxy-ip-10-0-128-142.ec2.internal.18a9679460d0af07 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:kube-apiserver-proxy-ip-10-0-128-142.ec2.internal,UID:97ad1a8b57a0086b081c576d14dd05e7,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{haproxy},},Reason:Created,Message:Created container: haproxy,Source:EventSource{Component:kubelet,Host:ip-10-0-128-142.ec2.internal,},FirstTimestamp:2026-04-24 21:16:54.951481095 +0000 UTC m=+3.010268760,LastTimestamp:2026-04-24 21:16:54.951481095 +0000 UTC m=+3.010268760,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-128-142.ec2.internal,}" Apr 24 21:16:54.969134 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:16:54.969069 2560 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{kube-apiserver-proxy-ip-10-0-128-142.ec2.internal.18a96794613f8a5d kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:kube-apiserver-proxy-ip-10-0-128-142.ec2.internal,UID:97ad1a8b57a0086b081c576d14dd05e7,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{haproxy},},Reason:Started,Message:Started container haproxy,Source:EventSource{Component:kubelet,Host:ip-10-0-128-142.ec2.internal,},FirstTimestamp:2026-04-24 21:16:54.958746205 +0000 UTC m=+3.017533873,LastTimestamp:2026-04-24 21:16:54.958746205 +0000 UTC m=+3.017533873,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-128-142.ec2.internal,}" Apr 24 21:16:55.317006 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:55.316981 2560 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-128-142.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 24 21:16:55.345358 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:16:55.345334 2560 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-128-142.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="3.2s" Apr 24 21:16:55.407792 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:16:55.407722 2560 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-ip-10-0-128-142.ec2.internal.18a967947b7e5252 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-ip-10-0-128-142.ec2.internal,UID:784ad9f5e87f1e75291c25fc06106e5e,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container: setup,Source:EventSource{Component:kubelet,Host:ip-10-0-128-142.ec2.internal,},FirstTimestamp:2026-04-24 21:16:55.399068242 +0000 UTC m=+3.457855904,LastTimestamp:2026-04-24 21:16:55.399068242 +0000 UTC m=+3.457855904,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-128-142.ec2.internal,}" Apr 24 21:16:55.416406 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:16:55.416343 2560 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-ip-10-0-128-142.ec2.internal.18a967947bf11cfa openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-ip-10-0-128-142.ec2.internal,UID:784ad9f5e87f1e75291c25fc06106e5e,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:ip-10-0-128-142.ec2.internal,},FirstTimestamp:2026-04-24 21:16:55.406591226 +0000 UTC m=+3.465378880,LastTimestamp:2026-04-24 21:16:55.406591226 +0000 UTC m=+3.465378880,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-128-142.ec2.internal,}" Apr 24 21:16:55.448637 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:55.448612 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-142.ec2.internal" event={"ID":"784ad9f5e87f1e75291c25fc06106e5e","Type":"ContainerStarted","Data":"aa7370fc97c32f6045f82331663db0e4e2e8700154568adf1b283d63797aa177"} Apr 24 21:16:55.448706 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:55.448665 2560 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:16:55.449417 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:55.449395 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-142.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:16:55.449488 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:55.449423 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-142.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:16:55.449488 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:55.449437 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-142.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:16:55.449652 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:16:55.449634 2560 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-128-142.ec2.internal\" not found" node="ip-10-0-128-142.ec2.internal" Apr 24 21:16:55.449903 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:55.449888 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-142.ec2.internal" event={"ID":"97ad1a8b57a0086b081c576d14dd05e7","Type":"ContainerStarted","Data":"a9235fc7adee1268c65e4c6f1f78d7e5eb82ac1c3206a95f85f41b2cf427b153"} Apr 24 21:16:55.449968 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:55.449958 2560 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:16:55.450555 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:55.450541 2560 kubelet_node_status.go:736] 
"Recording event message for node" node="ip-10-0-128-142.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:16:55.450627 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:55.450566 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-142.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:16:55.450627 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:55.450575 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-142.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:16:55.450702 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:16:55.450693 2560 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-128-142.ec2.internal\" not found" node="ip-10-0-128-142.ec2.internal" Apr 24 21:16:55.500915 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:16:55.500867 2560 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Apr 24 21:16:55.547380 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:55.547358 2560 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:16:55.548126 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:55.548110 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-142.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:16:55.548181 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:55.548142 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-142.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:16:55.548181 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:55.548156 2560 kubelet_node_status.go:736] "Recording event message for node" 
node="ip-10-0-128-142.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:16:55.548256 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:55.548182 2560 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-128-142.ec2.internal" Apr 24 21:16:55.567438 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:16:55.567393 2560 kubelet_node_status.go:116] "Unable to register node with API server, error getting existing node" err="nodes \"ip-10-0-128-142.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"nodes\" in API group \"\" at the cluster scope" node="ip-10-0-128-142.ec2.internal" Apr 24 21:16:55.930871 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:16:55.930827 2560 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 24 21:16:56.312507 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:56.312488 2560 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-128-142.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 24 21:16:56.454207 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:56.454170 2560 generic.go:358] "Generic (PLEG): container finished" podID="784ad9f5e87f1e75291c25fc06106e5e" containerID="aa7370fc97c32f6045f82331663db0e4e2e8700154568adf1b283d63797aa177" exitCode=0 Apr 24 21:16:56.454496 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:56.454265 2560 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:16:56.454496 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:56.454270 2560 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 
21:16:56.454496 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:56.454255 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-142.ec2.internal" event={"ID":"784ad9f5e87f1e75291c25fc06106e5e","Type":"ContainerDied","Data":"aa7370fc97c32f6045f82331663db0e4e2e8700154568adf1b283d63797aa177"} Apr 24 21:16:56.454976 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:56.454961 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-142.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:16:56.455043 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:56.454990 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-142.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:16:56.455043 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:56.455000 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-142.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:16:56.455121 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:56.454961 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-142.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:16:56.455121 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:56.455070 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-142.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:16:56.455121 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:56.455086 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-142.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:16:56.455226 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:16:56.455135 2560 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-128-142.ec2.internal\" not found" node="ip-10-0-128-142.ec2.internal" Apr 24 21:16:56.455283 ip-10-0-128-142 kubenswrapper[2560]: 
E0424 21:16:56.455271 2560 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-128-142.ec2.internal\" not found" node="ip-10-0-128-142.ec2.internal" Apr 24 21:16:56.465229 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:16:56.465137 2560 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-ip-10-0-128-142.ec2.internal.18a96794ba8ccefd openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-ip-10-0-128-142.ec2.internal,UID:784ad9f5e87f1e75291c25fc06106e5e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8e1f4bf2daa69c9e69766cebcfced43fa0ee926d4479becdc8a5b05b93ca6e81\" already present on machine,Source:EventSource{Component:kubelet,Host:ip-10-0-128-142.ec2.internal,},FirstTimestamp:2026-04-24 21:16:56.456982269 +0000 UTC m=+4.515769932,LastTimestamp:2026-04-24 21:16:56.456982269 +0000 UTC m=+4.515769932,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-128-142.ec2.internal,}" Apr 24 21:16:56.536576 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:16:56.536548 2560 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 24 21:16:56.566046 ip-10-0-128-142 kubenswrapper[2560]: 
E0424 21:16:56.565999 2560 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-128-142.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 24 21:16:56.566046 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:16:56.565956 2560 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-ip-10-0-128-142.ec2.internal.18a96794c081afbf openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-ip-10-0-128-142.ec2.internal,UID:784ad9f5e87f1e75291c25fc06106e5e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container: kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:ip-10-0-128-142.ec2.internal,},FirstTimestamp:2026-04-24 21:16:56.556916671 +0000 UTC m=+4.615704325,LastTimestamp:2026-04-24 21:16:56.556916671 +0000 UTC m=+4.615704325,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-128-142.ec2.internal,}" Apr 24 21:16:56.574322 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:16:56.574259 2560 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-ip-10-0-128-142.ec2.internal.18a96794c1001fd8 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] 
[] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-ip-10-0-128-142.ec2.internal,UID:784ad9f5e87f1e75291c25fc06106e5e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:ip-10-0-128-142.ec2.internal,},FirstTimestamp:2026-04-24 21:16:56.565202904 +0000 UTC m=+4.623990576,LastTimestamp:2026-04-24 21:16:56.565202904 +0000 UTC m=+4.623990576,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-128-142.ec2.internal,}" Apr 24 21:16:57.312528 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:57.312506 2560 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-128-142.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 24 21:16:57.456663 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:57.456592 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-128-142.ec2.internal_784ad9f5e87f1e75291c25fc06106e5e/kube-rbac-proxy-crio/0.log" Apr 24 21:16:57.457696 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:57.457675 2560 generic.go:358] "Generic (PLEG): container finished" podID="784ad9f5e87f1e75291c25fc06106e5e" containerID="20ada77477c60d0e886b2f70cc9d1af5bb21018de0b5a10caf0569d18a9682ea" exitCode=1 Apr 24 21:16:57.457738 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:57.457708 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-142.ec2.internal" event={"ID":"784ad9f5e87f1e75291c25fc06106e5e","Type":"ContainerDied","Data":"20ada77477c60d0e886b2f70cc9d1af5bb21018de0b5a10caf0569d18a9682ea"} Apr 24 
21:16:57.457780 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:57.457750 2560 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:16:57.458485 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:57.458470 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-142.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:16:57.458543 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:57.458496 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-142.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:16:57.458543 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:57.458506 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-142.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:16:57.458672 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:16:57.458660 2560 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-128-142.ec2.internal\" not found" node="ip-10-0-128-142.ec2.internal" Apr 24 21:16:57.458711 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:57.458703 2560 scope.go:117] "RemoveContainer" containerID="20ada77477c60d0e886b2f70cc9d1af5bb21018de0b5a10caf0569d18a9682ea" Apr 24 21:16:57.469338 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:16:57.469272 2560 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-ip-10-0-128-142.ec2.internal.18a96794ba8ccefd\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-ip-10-0-128-142.ec2.internal.18a96794ba8ccefd openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-ip-10-0-128-142.ec2.internal,UID:784ad9f5e87f1e75291c25fc06106e5e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8e1f4bf2daa69c9e69766cebcfced43fa0ee926d4479becdc8a5b05b93ca6e81\" already present on machine,Source:EventSource{Component:kubelet,Host:ip-10-0-128-142.ec2.internal,},FirstTimestamp:2026-04-24 21:16:56.456982269 +0000 UTC m=+4.515769932,LastTimestamp:2026-04-24 21:16:57.460554215 +0000 UTC m=+5.519341884,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-128-142.ec2.internal,}" Apr 24 21:16:57.568304 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:16:57.568238 2560 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-ip-10-0-128-142.ec2.internal.18a96794c081afbf\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-ip-10-0-128-142.ec2.internal.18a96794c081afbf openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-ip-10-0-128-142.ec2.internal,UID:784ad9f5e87f1e75291c25fc06106e5e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container: kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:ip-10-0-128-142.ec2.internal,},FirstTimestamp:2026-04-24 21:16:56.556916671 +0000 UTC m=+4.615704325,LastTimestamp:2026-04-24 21:16:57.559126475 +0000 UTC m=+5.617914137,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-128-142.ec2.internal,}" Apr 24 21:16:57.581273 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:16:57.581202 2560 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-ip-10-0-128-142.ec2.internal.18a96794c1001fd8\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-ip-10-0-128-142.ec2.internal.18a96794c1001fd8 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-ip-10-0-128-142.ec2.internal,UID:784ad9f5e87f1e75291c25fc06106e5e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:ip-10-0-128-142.ec2.internal,},FirstTimestamp:2026-04-24 21:16:56.565202904 +0000 UTC m=+4.623990576,LastTimestamp:2026-04-24 21:16:57.566978654 +0000 UTC m=+5.625766319,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-128-142.ec2.internal,}" Apr 24 21:16:58.312254 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:58.312228 2560 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-128-142.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 24 21:16:58.459996 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:58.459976 2560 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-128-142.ec2.internal_784ad9f5e87f1e75291c25fc06106e5e/kube-rbac-proxy-crio/1.log" Apr 24 21:16:58.460338 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:58.460312 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-128-142.ec2.internal_784ad9f5e87f1e75291c25fc06106e5e/kube-rbac-proxy-crio/0.log" Apr 24 21:16:58.460586 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:58.460564 2560 generic.go:358] "Generic (PLEG): container finished" podID="784ad9f5e87f1e75291c25fc06106e5e" containerID="ac08de7ee96723815c174c8fcc4392be7ef6e345a94a668215549563c75dce09" exitCode=1 Apr 24 21:16:58.460686 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:58.460596 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-142.ec2.internal" event={"ID":"784ad9f5e87f1e75291c25fc06106e5e","Type":"ContainerDied","Data":"ac08de7ee96723815c174c8fcc4392be7ef6e345a94a668215549563c75dce09"} Apr 24 21:16:58.460686 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:58.460631 2560 scope.go:117] "RemoveContainer" containerID="20ada77477c60d0e886b2f70cc9d1af5bb21018de0b5a10caf0569d18a9682ea" Apr 24 21:16:58.460686 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:58.460636 2560 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:16:58.461337 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:58.461322 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-142.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:16:58.461398 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:58.461346 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-142.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:16:58.461398 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:58.461359 
2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-142.ec2.internal" event="NodeHasSufficientPID"
Apr 24 21:16:58.461738 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:16:58.461572 2560 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-128-142.ec2.internal\" not found" node="ip-10-0-128-142.ec2.internal"
Apr 24 21:16:58.461738 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:58.461613 2560 scope.go:117] "RemoveContainer" containerID="ac08de7ee96723815c174c8fcc4392be7ef6e345a94a668215549563c75dce09"
Apr 24 21:16:58.461808 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:16:58.461735 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy-crio\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-rbac-proxy-crio pod=kube-rbac-proxy-crio-ip-10-0-128-142.ec2.internal_openshift-machine-config-operator(784ad9f5e87f1e75291c25fc06106e5e)\"" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-142.ec2.internal" podUID="784ad9f5e87f1e75291c25fc06106e5e"
Apr 24 21:16:58.470535 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:16:58.470468 2560 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-ip-10-0-128-142.ec2.internal.18a96795320a775c openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-ip-10-0-128-142.ec2.internal,UID:784ad9f5e87f1e75291c25fc06106e5e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:BackOff,Message:Back-off restarting failed container kube-rbac-proxy-crio in pod
kube-rbac-proxy-crio-ip-10-0-128-142.ec2.internal_openshift-machine-config-operator(784ad9f5e87f1e75291c25fc06106e5e),Source:EventSource{Component:kubelet,Host:ip-10-0-128-142.ec2.internal,},FirstTimestamp:2026-04-24 21:16:58.461706076 +0000 UTC m=+6.520493737,LastTimestamp:2026-04-24 21:16:58.461706076 +0000 UTC m=+6.520493737,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-128-142.ec2.internal,}"
Apr 24 21:16:58.553743 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:16:58.553721 2560 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-128-142.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="6.4s"
Apr 24 21:16:58.767583 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:58.767568 2560 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 21:16:58.768875 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:58.768852 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-142.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 21:16:58.768966 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:58.768884 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-142.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 21:16:58.768966 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:58.768899 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-142.ec2.internal" event="NodeHasSufficientPID"
Apr 24 21:16:58.768966 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:58.768943 2560 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-128-142.ec2.internal"
Apr 24 21:16:58.785875 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:16:58.785853 2560
kubelet_node_status.go:116] "Unable to register node with API server, error getting existing node" err="nodes \"ip-10-0-128-142.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"nodes\" in API group \"\" at the cluster scope" node="ip-10-0-128-142.ec2.internal"
Apr 24 21:16:59.312035 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:59.312011 2560 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-128-142.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 24 21:16:59.462876 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:59.462852 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-128-142.ec2.internal_784ad9f5e87f1e75291c25fc06106e5e/kube-rbac-proxy-crio/1.log"
Apr 24 21:16:59.463250 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:59.463236 2560 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 21:16:59.463973 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:59.463959 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-142.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 21:16:59.464022 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:59.463987 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-142.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 21:16:59.464022 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:59.463997 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-142.ec2.internal" event="NodeHasSufficientPID"
Apr 24 21:16:59.464194 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:16:59.464181 2560 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-128-142.ec2.internal\" not found"
node="ip-10-0-128-142.ec2.internal"
Apr 24 21:16:59.464249 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:16:59.464226 2560 scope.go:117] "RemoveContainer" containerID="ac08de7ee96723815c174c8fcc4392be7ef6e345a94a668215549563c75dce09"
Apr 24 21:16:59.464353 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:16:59.464338 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy-crio\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-rbac-proxy-crio pod=kube-rbac-proxy-crio-ip-10-0-128-142.ec2.internal_openshift-machine-config-operator(784ad9f5e87f1e75291c25fc06106e5e)\"" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-142.ec2.internal" podUID="784ad9f5e87f1e75291c25fc06106e5e"
Apr 24 21:16:59.473469 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:16:59.473396 2560 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-ip-10-0-128-142.ec2.internal.18a96795320a775c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-ip-10-0-128-142.ec2.internal.18a96795320a775c openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-ip-10-0-128-142.ec2.internal,UID:784ad9f5e87f1e75291c25fc06106e5e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:BackOff,Message:Back-off restarting failed container kube-rbac-proxy-crio in pod kube-rbac-proxy-crio-ip-10-0-128-142.ec2.internal_openshift-machine-config-operator(784ad9f5e87f1e75291c25fc06106e5e),Source:EventSource{Component:kubelet,Host:ip-10-0-128-142.ec2.internal,},FirstTimestamp:2026-04-24 21:16:58.461706076 +0000 UTC m=+6.520493737,LastTimestamp:2026-04-24 21:16:59.46431128 +0000 UTC
m=+7.523098945,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-128-142.ec2.internal,}"
Apr 24 21:17:00.311780 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:00.311748 2560 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-128-142.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 24 21:17:00.662364 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:00.662284 2560 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass"
Apr 24 21:17:00.662364 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:00.662284 2560 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 24 21:17:01.310976 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:01.310944 2560 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-128-142.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 24 21:17:01.794907 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:01.794877 2560 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-128-142.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope"
logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 24 21:17:01.983060 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:01.983028 2560 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 24 21:17:02.310179 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:02.310159 2560 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-128-142.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 24 21:17:02.366569 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:02.366540 2560 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-128-142.ec2.internal\" not found"
Apr 24 21:17:03.312469 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:03.312436 2560 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-128-142.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 24 21:17:04.314355 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:04.314325 2560 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-128-142.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 24 21:17:04.963109 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:04.963076 2560 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io
\"ip-10-0-128-142.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Apr 24 21:17:05.186589 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:05.186545 2560 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 21:17:05.187489 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:05.187472 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-142.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 21:17:05.187591 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:05.187506 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-142.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 21:17:05.187591 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:05.187521 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-142.ec2.internal" event="NodeHasSufficientPID"
Apr 24 21:17:05.187591 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:05.187556 2560 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-128-142.ec2.internal"
Apr 24 21:17:05.204908 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:05.204880 2560 kubelet_node_status.go:116] "Unable to register node with API server, error getting existing node" err="nodes \"ip-10-0-128-142.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"nodes\" in API group \"\" at the cluster scope" node="ip-10-0-128-142.ec2.internal"
Apr 24 21:17:05.312410 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:05.312389 2560 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-128-142.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 24 21:17:06.318576 ip-10-0-128-142 kubenswrapper[2560]: I0424
21:17:06.318544 2560 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-128-142.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 24 21:17:07.312486 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:07.312458 2560 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-128-142.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 24 21:17:08.312167 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:08.312138 2560 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-128-142.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 24 21:17:08.396911 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:08.396883 2560 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass"
Apr 24 21:17:09.311602 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:09.311574 2560 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-128-142.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 24 21:17:10.312765 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:10.312736 2560 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-128-142.ec2.internal" is forbidden: User
"system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 24 21:17:10.441528 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:10.441507 2560 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 21:17:10.442416 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:10.442396 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-142.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 21:17:10.442502 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:10.442427 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-142.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 21:17:10.442502 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:10.442438 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-142.ec2.internal" event="NodeHasSufficientPID"
Apr 24 21:17:10.442656 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:10.442644 2560 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-128-142.ec2.internal\" not found" node="ip-10-0-128-142.ec2.internal"
Apr 24 21:17:10.442699 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:10.442690 2560 scope.go:117] "RemoveContainer" containerID="ac08de7ee96723815c174c8fcc4392be7ef6e345a94a668215549563c75dce09"
Apr 24 21:17:10.453518 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:10.453421 2560 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-ip-10-0-128-142.ec2.internal.18a96794ba8ccefd\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-ip-10-0-128-142.ec2.internal.18a96794ba8ccefd openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] []
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-ip-10-0-128-142.ec2.internal,UID:784ad9f5e87f1e75291c25fc06106e5e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8e1f4bf2daa69c9e69766cebcfced43fa0ee926d4479becdc8a5b05b93ca6e81\" already present on machine,Source:EventSource{Component:kubelet,Host:ip-10-0-128-142.ec2.internal,},FirstTimestamp:2026-04-24 21:16:56.456982269 +0000 UTC m=+4.515769932,LastTimestamp:2026-04-24 21:17:10.444638959 +0000 UTC m=+18.503426629,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-128-142.ec2.internal,}"
Apr 24 21:17:10.547605 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:10.547505 2560 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-ip-10-0-128-142.ec2.internal.18a96794c081afbf\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-ip-10-0-128-142.ec2.internal.18a96794c081afbf openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-ip-10-0-128-142.ec2.internal,UID:784ad9f5e87f1e75291c25fc06106e5e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container: kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:ip-10-0-128-142.ec2.internal,},FirstTimestamp:2026-04-24 21:16:56.556916671 +0000 UTC m=+4.615704325,LastTimestamp:2026-04-24 21:17:10.538154254 +0000 UTC m=+18.596941919,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-128-142.ec2.internal,}"
Apr 24 21:17:10.557430 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:10.557359 2560 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-ip-10-0-128-142.ec2.internal.18a96794c1001fd8\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-ip-10-0-128-142.ec2.internal.18a96794c1001fd8 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-ip-10-0-128-142.ec2.internal,UID:784ad9f5e87f1e75291c25fc06106e5e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:ip-10-0-128-142.ec2.internal,},FirstTimestamp:2026-04-24 21:16:56.565202904 +0000 UTC m=+4.623990576,LastTimestamp:2026-04-24 21:17:10.545762106 +0000 UTC m=+18.604549774,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-128-142.ec2.internal,}"
Apr 24 21:17:11.311716 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:11.311683 2560 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-128-142.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 24 21:17:11.479596 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:11.479575 2560 log.go:25] "Finished parsing log file"
path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-128-142.ec2.internal_784ad9f5e87f1e75291c25fc06106e5e/kube-rbac-proxy-crio/2.log"
Apr 24 21:17:11.479978 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:11.479894 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-128-142.ec2.internal_784ad9f5e87f1e75291c25fc06106e5e/kube-rbac-proxy-crio/1.log"
Apr 24 21:17:11.480264 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:11.480241 2560 generic.go:358] "Generic (PLEG): container finished" podID="784ad9f5e87f1e75291c25fc06106e5e" containerID="d883a9ca304a7a3c7115f428d33ac1de550a2326595ec262d9a96659becf12ff" exitCode=1
Apr 24 21:17:11.480343 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:11.480278 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-142.ec2.internal" event={"ID":"784ad9f5e87f1e75291c25fc06106e5e","Type":"ContainerDied","Data":"d883a9ca304a7a3c7115f428d33ac1de550a2326595ec262d9a96659becf12ff"}
Apr 24 21:17:11.480343 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:11.480309 2560 scope.go:117] "RemoveContainer" containerID="ac08de7ee96723815c174c8fcc4392be7ef6e345a94a668215549563c75dce09"
Apr 24 21:17:11.480429 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:11.480385 2560 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 21:17:11.481138 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:11.481122 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-142.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 21:17:11.481225 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:11.481152 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-142.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 21:17:11.481225 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:11.481165
2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-142.ec2.internal" event="NodeHasSufficientPID"
Apr 24 21:17:11.481504 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:11.481417 2560 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-128-142.ec2.internal\" not found" node="ip-10-0-128-142.ec2.internal"
Apr 24 21:17:11.481504 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:11.481469 2560 scope.go:117] "RemoveContainer" containerID="d883a9ca304a7a3c7115f428d33ac1de550a2326595ec262d9a96659becf12ff"
Apr 24 21:17:11.481648 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:11.481629 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy-crio\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-rbac-proxy-crio pod=kube-rbac-proxy-crio-ip-10-0-128-142.ec2.internal_openshift-machine-config-operator(784ad9f5e87f1e75291c25fc06106e5e)\"" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-142.ec2.internal" podUID="784ad9f5e87f1e75291c25fc06106e5e"
Apr 24 21:17:11.489561 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:11.489495 2560 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-ip-10-0-128-142.ec2.internal.18a96795320a775c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-ip-10-0-128-142.ec2.internal.18a96795320a775c openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-ip-10-0-128-142.ec2.internal,UID:784ad9f5e87f1e75291c25fc06106e5e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:BackOff,Message:Back-off
restarting failed container kube-rbac-proxy-crio in pod kube-rbac-proxy-crio-ip-10-0-128-142.ec2.internal_openshift-machine-config-operator(784ad9f5e87f1e75291c25fc06106e5e),Source:EventSource{Component:kubelet,Host:ip-10-0-128-142.ec2.internal,},FirstTimestamp:2026-04-24 21:16:58.461706076 +0000 UTC m=+6.520493737,LastTimestamp:2026-04-24 21:17:11.481587972 +0000 UTC m=+19.540375640,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-128-142.ec2.internal,}"
Apr 24 21:17:11.737583 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:11.737528 2560 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-128-142.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 24 21:17:11.972520 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:11.972498 2560 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-128-142.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Apr 24 21:17:12.205804 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:12.205786 2560 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 21:17:12.206611 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:12.206596 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-142.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 21:17:12.206666 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:12.206625 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-142.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 21:17:12.206666 ip-10-0-128-142
kubenswrapper[2560]: I0424 21:17:12.206639 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-142.ec2.internal" event="NodeHasSufficientPID"
Apr 24 21:17:12.206746 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:12.206668 2560 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-128-142.ec2.internal"
Apr 24 21:17:12.223147 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:12.223126 2560 kubelet_node_status.go:116] "Unable to register node with API server, error getting existing node" err="nodes \"ip-10-0-128-142.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"nodes\" in API group \"\" at the cluster scope" node="ip-10-0-128-142.ec2.internal"
Apr 24 21:17:12.311685 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:12.311660 2560 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-128-142.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 24 21:17:12.367100 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:12.367066 2560 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-128-142.ec2.internal\" not found"
Apr 24 21:17:12.482712 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:12.482656 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-128-142.ec2.internal_784ad9f5e87f1e75291c25fc06106e5e/kube-rbac-proxy-crio/2.log"
Apr 24 21:17:12.864335 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:12.864311 2560 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError"
reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 24 21:17:13.311443 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:13.311416 2560 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-128-142.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 24 21:17:13.353844 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:13.353819 2560 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 24 21:17:14.311824 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:14.311796 2560 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-128-142.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 24 21:17:15.312792 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:15.312767 2560 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-128-142.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 24 21:17:16.313605 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:16.313575 2560 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-128-142.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 24 21:17:16.762605 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:16.762577 2560 csr.go:270] "Certificate signing request is issued"
logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-zpk2d"
Apr 24 21:17:17.226819 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:17.226775 2560 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 24 21:17:17.325196 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:17.325176 2560 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-128-142.ec2.internal" not found
Apr 24 21:17:17.344776 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:17.344760 2560 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-128-142.ec2.internal" not found
Apr 24 21:17:17.403359 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:17.403317 2560 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-128-142.ec2.internal" not found
Apr 24 21:17:17.676987 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:17.676937 2560 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-128-142.ec2.internal" not found
Apr 24 21:17:17.676987 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:17.676957 2560 csi_plugin.go:397] Failed to initialize CSINode: error updating CSINode annotation: timed out waiting for the condition; caused by: nodes "ip-10-0-128-142.ec2.internal" not found
Apr 24 21:17:17.712293 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:17.712277 2560 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-128-142.ec2.internal" not found
Apr 24 21:17:17.728061 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:17.728047 2560 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-128-142.ec2.internal" not found
Apr 24 21:17:17.763466 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:17.763434 2560 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-23 21:12:16 +0000 UTC" deadline="2028-02-06 20:16:27.803275012 +0000 UTC"
Apr 24 21:17:17.763466 ip-10-0-128-142
kubenswrapper[2560]: I0424 21:17:17.763464 2560 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15670h59m10.039813548s" Apr 24 21:17:17.783212 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:17.783198 2560 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-128-142.ec2.internal" not found Apr 24 21:17:18.053681 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:18.053664 2560 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-128-142.ec2.internal" not found Apr 24 21:17:18.053760 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:18.053683 2560 csi_plugin.go:397] Failed to initialize CSINode: error updating CSINode annotation: timed out waiting for the condition; caused by: nodes "ip-10-0-128-142.ec2.internal" not found Apr 24 21:17:18.311504 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:18.311454 2560 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-128-142.ec2.internal" not found Apr 24 21:17:18.327506 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:18.327490 2560 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-128-142.ec2.internal" not found Apr 24 21:17:18.385609 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:18.385589 2560 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-128-142.ec2.internal" not found Apr 24 21:17:18.649339 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:18.649279 2560 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-128-142.ec2.internal" not found Apr 24 21:17:18.649339 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:18.649297 2560 csi_plugin.go:397] Failed to initialize CSINode: error updating CSINode annotation: timed out waiting for the condition; caused by: nodes "ip-10-0-128-142.ec2.internal" not found Apr 24 21:17:18.977908 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:18.977843 2560 nodelease.go:49] "Failed to get node when trying to set owner ref to 
the node lease" err="nodes \"ip-10-0-128-142.ec2.internal\" not found" node="ip-10-0-128-142.ec2.internal" Apr 24 21:17:19.223744 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:19.223712 2560 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:17:19.224736 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:19.224718 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-142.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:17:19.224823 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:19.224750 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-142.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:17:19.224823 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:19.224759 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-142.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:17:19.224823 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:19.224803 2560 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-128-142.ec2.internal" Apr 24 21:17:19.233967 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:19.233906 2560 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-128-142.ec2.internal" Apr 24 21:17:19.233967 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:19.233945 2560 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-128-142.ec2.internal\": node \"ip-10-0-128-142.ec2.internal\" not found" Apr 24 21:17:19.260575 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:19.260554 2560 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-142.ec2.internal\" not found" Apr 24 21:17:19.321673 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:19.321658 2560 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 24 21:17:19.337662 ip-10-0-128-142 
kubenswrapper[2560]: I0424 21:17:19.337645 2560 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 24 21:17:19.361063 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:19.361046 2560 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-142.ec2.internal\" not found" Apr 24 21:17:19.397936 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:19.397902 2560 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-h7mj2" Apr 24 21:17:19.406817 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:19.406802 2560 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-h7mj2" Apr 24 21:17:19.461229 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:19.461210 2560 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-142.ec2.internal\" not found" Apr 24 21:17:19.561479 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:19.561462 2560 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-142.ec2.internal\" not found" Apr 24 21:17:19.661918 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:19.661896 2560 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-142.ec2.internal\" not found" Apr 24 21:17:19.762419 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:19.762397 2560 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-142.ec2.internal\" not found" Apr 24 21:17:19.863056 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:19.863017 2560 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-142.ec2.internal\" not found" Apr 24 21:17:19.963442 ip-10-0-128-142 kubenswrapper[2560]: E0424 
21:17:19.963428 2560 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-142.ec2.internal\" not found" Apr 24 21:17:20.063887 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:20.063866 2560 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-142.ec2.internal\" not found" Apr 24 21:17:20.164296 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:20.164266 2560 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-142.ec2.internal\" not found" Apr 24 21:17:20.264775 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:20.264755 2560 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-142.ec2.internal\" not found" Apr 24 21:17:20.365440 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:20.365419 2560 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-142.ec2.internal\" not found" Apr 24 21:17:20.408198 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:20.408178 2560 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 21:12:19 +0000 UTC" deadline="2027-09-26 19:46:48.100602417 +0000 UTC" Apr 24 21:17:20.408198 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:20.408196 2560 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12478h29m27.692409333s" Apr 24 21:17:20.465542 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:20.465505 2560 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-142.ec2.internal\" not found" Apr 24 21:17:20.565600 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:20.565577 2560 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-142.ec2.internal\" not found" Apr 24 21:17:20.666087 ip-10-0-128-142 
kubenswrapper[2560]: E0424 21:17:20.666070 2560 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-142.ec2.internal\" not found" Apr 24 21:17:20.766646 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:20.766625 2560 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-142.ec2.internal\" not found" Apr 24 21:17:20.867577 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:20.867550 2560 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-142.ec2.internal\" not found" Apr 24 21:17:20.968055 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:20.968034 2560 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-142.ec2.internal\" not found" Apr 24 21:17:21.068480 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:21.068430 2560 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-142.ec2.internal\" not found" Apr 24 21:17:21.169245 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:21.169224 2560 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-142.ec2.internal\" not found" Apr 24 21:17:21.269471 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:21.269450 2560 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-142.ec2.internal\" not found" Apr 24 21:17:21.370155 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:21.370121 2560 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-142.ec2.internal\" not found" Apr 24 21:17:21.408534 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:21.408510 2560 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 21:12:19 +0000 UTC" deadline="2028-01-03 21:41:24.968676726 +0000 UTC" Apr 24 21:17:21.408534 
ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:21.408531 2560 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14856h24m3.560147912s" Apr 24 21:17:21.470913 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:21.470896 2560 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-142.ec2.internal\" not found" Apr 24 21:17:21.571499 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:21.571479 2560 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-142.ec2.internal\" not found" Apr 24 21:17:21.672005 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:21.671963 2560 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-142.ec2.internal\" not found" Apr 24 21:17:21.772534 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:21.772510 2560 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-142.ec2.internal\" not found" Apr 24 21:17:21.873167 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:21.873151 2560 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-142.ec2.internal\" not found" Apr 24 21:17:21.974066 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:21.974036 2560 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-142.ec2.internal\" not found" Apr 24 21:17:22.074469 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:22.074451 2560 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-142.ec2.internal\" not found" Apr 24 21:17:22.174913 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:22.174889 2560 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-142.ec2.internal\" not found" Apr 24 21:17:22.275681 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:22.275664 2560 
kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-142.ec2.internal\" not found" Apr 24 21:17:22.367751 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:22.367734 2560 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-128-142.ec2.internal\" not found" Apr 24 21:17:22.376718 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:22.376699 2560 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-142.ec2.internal\" not found" Apr 24 21:17:22.440603 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:22.440586 2560 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:17:22.441426 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:22.441413 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-142.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:17:22.441542 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:22.441439 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-142.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:17:22.441542 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:22.441453 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-142.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:17:22.441697 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:22.441683 2560 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-128-142.ec2.internal\" not found" node="ip-10-0-128-142.ec2.internal" Apr 24 21:17:22.441750 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:22.441740 2560 scope.go:117] "RemoveContainer" containerID="d883a9ca304a7a3c7115f428d33ac1de550a2326595ec262d9a96659becf12ff" Apr 24 21:17:22.441881 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:22.441866 2560 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy-crio\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-rbac-proxy-crio pod=kube-rbac-proxy-crio-ip-10-0-128-142.ec2.internal_openshift-machine-config-operator(784ad9f5e87f1e75291c25fc06106e5e)\"" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-142.ec2.internal" podUID="784ad9f5e87f1e75291c25fc06106e5e" Apr 24 21:17:22.476850 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:22.476827 2560 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-142.ec2.internal\" not found" Apr 24 21:17:22.576952 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:22.576910 2560 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-142.ec2.internal\" not found" Apr 24 21:17:22.677609 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:22.677586 2560 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-142.ec2.internal\" not found" Apr 24 21:17:22.778594 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:22.778567 2560 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-142.ec2.internal\" not found" Apr 24 21:17:22.879259 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:22.879199 2560 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-142.ec2.internal\" not found" Apr 24 21:17:22.979889 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:22.979871 2560 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-142.ec2.internal\" not found" Apr 24 21:17:23.080324 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:23.080301 2560 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-142.ec2.internal\" not found" Apr 24 21:17:23.180984 ip-10-0-128-142 kubenswrapper[2560]: E0424 
21:17:23.180945 2560 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-142.ec2.internal\" not found" Apr 24 21:17:23.281517 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:23.281496 2560 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-142.ec2.internal\" not found" Apr 24 21:17:23.381912 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:23.381887 2560 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-142.ec2.internal\" not found" Apr 24 21:17:23.482106 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:23.482062 2560 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-142.ec2.internal\" not found" Apr 24 21:17:23.582327 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:23.582302 2560 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-142.ec2.internal\" not found" Apr 24 21:17:23.682769 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:23.682752 2560 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-142.ec2.internal\" not found" Apr 24 21:17:23.782990 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:23.782968 2560 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-142.ec2.internal\" not found" Apr 24 21:17:23.883528 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:23.883507 2560 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-142.ec2.internal\" not found" Apr 24 21:17:23.984299 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:23.984279 2560 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-142.ec2.internal\" not found" Apr 24 21:17:24.085212 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:24.085177 2560 kubelet_node_status.go:515] "Error getting the 
current node from lister" err="node \"ip-10-0-128-142.ec2.internal\" not found" Apr 24 21:17:24.185666 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:24.185640 2560 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-142.ec2.internal\" not found" Apr 24 21:17:24.286434 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:24.286411 2560 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-142.ec2.internal\" not found" Apr 24 21:17:24.387487 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:24.387449 2560 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-142.ec2.internal\" not found" Apr 24 21:17:24.488129 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:24.488113 2560 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-142.ec2.internal\" not found" Apr 24 21:17:24.588368 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:24.588351 2560 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-142.ec2.internal\" not found" Apr 24 21:17:24.688843 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:24.688804 2560 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-142.ec2.internal\" not found" Apr 24 21:17:24.789392 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:24.789371 2560 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-142.ec2.internal\" not found" Apr 24 21:17:24.889830 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:24.889809 2560 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-142.ec2.internal\" not found" Apr 24 21:17:24.990883 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:24.990857 2560 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-142.ec2.internal\" not 
found" Apr 24 21:17:25.091325 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:25.091299 2560 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-142.ec2.internal\" not found" Apr 24 21:17:25.144725 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.143595 2560 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 21:17:25.206566 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.206544 2560 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-142.ec2.internal" Apr 24 21:17:25.219113 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.219095 2560 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 24 21:17:25.220043 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.220029 2560 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-142.ec2.internal" Apr 24 21:17:25.226118 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.226104 2560 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 24 21:17:25.304225 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.304206 2560 apiserver.go:52] "Watching apiserver" Apr 24 21:17:25.310903 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.310885 2560 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 24 21:17:25.311802 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.311784 2560 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-image-registry/node-ca-r5z5s","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-142.ec2.internal","openshift-network-diagnostics/network-check-target-qbhf5","openshift-ovn-kubernetes/ovnkube-node-6rbsp","kube-system/kube-apiserver-proxy-ip-10-0-128-142.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2t4hs","openshift-cluster-node-tuning-operator/tuned-97kf5","openshift-dns/node-resolver-x6t58","openshift-multus/multus-additional-cni-plugins-kqxwj","openshift-multus/multus-df2mh","openshift-multus/network-metrics-daemon-ldwdd","openshift-network-operator/iptables-alerter-q6gt8","kube-system/konnectivity-agent-t6vpk"] Apr 24 21:17:25.315902 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.315888 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-97kf5" Apr 24 21:17:25.315997 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.315983 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qbhf5" Apr 24 21:17:25.316062 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:25.316034 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qbhf5" podUID="68f2b2b0-7441-4845-8db7-2d2bdb770218" Apr 24 21:17:25.317073 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.317054 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-6rbsp" Apr 24 21:17:25.318002 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.317985 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2t4hs" Apr 24 21:17:25.318345 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.318329 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 24 21:17:25.318603 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.318591 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 24 21:17:25.318853 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.318839 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-s2wnh\"" Apr 24 21:17:25.319999 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.319984 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-r5z5s" Apr 24 21:17:25.320096 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.320081 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-x6t58" Apr 24 21:17:25.320230 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.320215 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 24 21:17:25.320292 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.320229 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 24 21:17:25.320498 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.320483 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 24 21:17:25.320557 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.320549 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 24 21:17:25.320710 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.320696 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 24 21:17:25.320751 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.320716 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 24 21:17:25.320810 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.320793 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 24 21:17:25.320857 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.320835 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-hhs9n\"" Apr 24 21:17:25.320906 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.320886 2560 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 24 21:17:25.321161 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.321145 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 24 21:17:25.321239 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.321227 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-pml99\"" Apr 24 21:17:25.321303 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.321285 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-kqxwj" Apr 24 21:17:25.322399 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.322185 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 24 21:17:25.322399 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.322264 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 24 21:17:25.323337 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.322454 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 24 21:17:25.323337 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.322515 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-df2mh" Apr 24 21:17:25.323337 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.322601 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 24 21:17:25.323337 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.322716 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-tz8dc\"" Apr 24 21:17:25.323337 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.323013 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 24 21:17:25.323337 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.323104 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-9hfvp\"" Apr 24 21:17:25.323757 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.323744 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ldwdd" Apr 24 21:17:25.323828 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:25.323797 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ldwdd" podUID="23cacaf9-66cf-483e-89f1-70f1b4c3cc3c" Apr 24 21:17:25.324594 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.324574 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 24 21:17:25.324681 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.324633 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 24 21:17:25.324681 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.324668 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 24 21:17:25.324792 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.324576 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 24 21:17:25.324792 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.324633 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 24 21:17:25.324792 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.324762 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 24 21:17:25.324792 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.324752 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-tb85r\"" Apr 24 21:17:25.325046 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.324957 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-vldl6\"" Apr 24 21:17:25.325046 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.325003 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-q6gt8"
Apr 24 21:17:25.326599 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.326583 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-t6vpk"
Apr 24 21:17:25.326678 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.326621 2560 scope.go:117] "RemoveContainer" containerID="d883a9ca304a7a3c7115f428d33ac1de550a2326595ec262d9a96659becf12ff"
Apr 24 21:17:25.326828 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:25.326810 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy-crio\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-rbac-proxy-crio pod=kube-rbac-proxy-crio-ip-10-0-128-142.ec2.internal_openshift-machine-config-operator(784ad9f5e87f1e75291c25fc06106e5e)\"" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-142.ec2.internal" podUID="784ad9f5e87f1e75291c25fc06106e5e"
Apr 24 21:17:25.327164 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.327147 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 24 21:17:25.327233 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.327223 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 24 21:17:25.327312 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.327296 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 24 21:17:25.327413 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.327399 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-n99pp\""
Apr 24 21:17:25.328715 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.328701 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 24 21:17:25.328819 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.328806 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 24 21:17:25.329049 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.329036 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-k77jb\""
Apr 24 21:17:25.376511 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.376478 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-142.ec2.internal" podStartSLOduration=0.376469155 podStartE2EDuration="376.469155ms" podCreationTimestamp="2026-04-24 21:17:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:17:25.376441792 +0000 UTC m=+33.435229462" watchObservedRunningTime="2026-04-24 21:17:25.376469155 +0000 UTC m=+33.435256816"
Apr 24 21:17:25.407604 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.407582 2560 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 24 21:17:25.475894 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.475878 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/260be1ca-939b-4b9e-9f93-078d2506aef0-log-socket\") pod \"ovnkube-node-6rbsp\" (UID: \"260be1ca-939b-4b9e-9f93-078d2506aef0\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rbsp"
Apr 24 21:17:25.475998 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.475899 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/a5ac4a85-e0fb-4193-bc28-23442097690b-etc-modprobe-d\") pod \"tuned-97kf5\" (UID: \"a5ac4a85-e0fb-4193-bc28-23442097690b\") " pod="openshift-cluster-node-tuning-operator/tuned-97kf5"
Apr 24 21:17:25.475998 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.475915 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a5ac4a85-e0fb-4193-bc28-23442097690b-lib-modules\") pod \"tuned-97kf5\" (UID: \"a5ac4a85-e0fb-4193-bc28-23442097690b\") " pod="openshift-cluster-node-tuning-operator/tuned-97kf5"
Apr 24 21:17:25.475998 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.475949 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/260be1ca-939b-4b9e-9f93-078d2506aef0-node-log\") pod \"ovnkube-node-6rbsp\" (UID: \"260be1ca-939b-4b9e-9f93-078d2506aef0\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rbsp"
Apr 24 21:17:25.475998 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.475966 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/260be1ca-939b-4b9e-9f93-078d2506aef0-host-cni-bin\") pod \"ovnkube-node-6rbsp\" (UID: \"260be1ca-939b-4b9e-9f93-078d2506aef0\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rbsp"
Apr 24 21:17:25.475998 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.475980 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a5ac4a85-e0fb-4193-bc28-23442097690b-host\") pod \"tuned-97kf5\" (UID: \"a5ac4a85-e0fb-4193-bc28-23442097690b\") " pod="openshift-cluster-node-tuning-operator/tuned-97kf5"
Apr 24 21:17:25.475998 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.475993 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b4deb568-1692-4033-a2a6-45866b8c89db-hosts-file\") pod \"node-resolver-x6t58\" (UID: \"b4deb568-1692-4033-a2a6-45866b8c89db\") " pod="openshift-dns/node-resolver-x6t58"
Apr 24 21:17:25.476266 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.476006 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/849dc6e4-06dd-4834-a5eb-de6ceebd649f-cnibin\") pod \"multus-additional-cni-plugins-kqxwj\" (UID: \"849dc6e4-06dd-4834-a5eb-de6ceebd649f\") " pod="openshift-multus/multus-additional-cni-plugins-kqxwj"
Apr 24 21:17:25.476266 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.476020 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d0fb6fb7-0535-4776-8fae-ef83f9bfdcca-cnibin\") pod \"multus-df2mh\" (UID: \"d0fb6fb7-0535-4776-8fae-ef83f9bfdcca\") " pod="openshift-multus/multus-df2mh"
Apr 24 21:17:25.476266 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.476032 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d0fb6fb7-0535-4776-8fae-ef83f9bfdcca-host-var-lib-kubelet\") pod \"multus-df2mh\" (UID: \"d0fb6fb7-0535-4776-8fae-ef83f9bfdcca\") " pod="openshift-multus/multus-df2mh"
Apr 24 21:17:25.476266 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.476046 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/3ef140eb-72c8-462e-9469-9aa900c0be05-device-dir\") pod \"aws-ebs-csi-driver-node-2t4hs\" (UID: \"3ef140eb-72c8-462e-9469-9aa900c0be05\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2t4hs"
Apr 24 21:17:25.476266 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.476069 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a5ac4a85-e0fb-4193-bc28-23442097690b-var-lib-kubelet\") pod \"tuned-97kf5\" (UID: \"a5ac4a85-e0fb-4193-bc28-23442097690b\") " pod="openshift-cluster-node-tuning-operator/tuned-97kf5"
Apr 24 21:17:25.476266 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.476105 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/260be1ca-939b-4b9e-9f93-078d2506aef0-host-run-netns\") pod \"ovnkube-node-6rbsp\" (UID: \"260be1ca-939b-4b9e-9f93-078d2506aef0\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rbsp"
Apr 24 21:17:25.476266 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.476129 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/260be1ca-939b-4b9e-9f93-078d2506aef0-run-systemd\") pod \"ovnkube-node-6rbsp\" (UID: \"260be1ca-939b-4b9e-9f93-078d2506aef0\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rbsp"
Apr 24 21:17:25.476266 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.476148 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d0fb6fb7-0535-4776-8fae-ef83f9bfdcca-host-var-lib-cni-bin\") pod \"multus-df2mh\" (UID: \"d0fb6fb7-0535-4776-8fae-ef83f9bfdcca\") " pod="openshift-multus/multus-df2mh"
Apr 24 21:17:25.476266 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.476197 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/a5ac4a85-e0fb-4193-bc28-23442097690b-etc-sysctl-d\") pod \"tuned-97kf5\" (UID: \"a5ac4a85-e0fb-4193-bc28-23442097690b\") " pod="openshift-cluster-node-tuning-operator/tuned-97kf5"
Apr 24 21:17:25.476266 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.476248 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmrnc\" (UniqueName: \"kubernetes.io/projected/260be1ca-939b-4b9e-9f93-078d2506aef0-kube-api-access-fmrnc\") pod \"ovnkube-node-6rbsp\" (UID: \"260be1ca-939b-4b9e-9f93-078d2506aef0\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rbsp"
Apr 24 21:17:25.476616 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.476282 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fj9n\" (UniqueName: \"kubernetes.io/projected/ce514c8a-dcef-4be4-9c59-0a9305bef822-kube-api-access-9fj9n\") pod \"node-ca-r5z5s\" (UID: \"ce514c8a-dcef-4be4-9c59-0a9305bef822\") " pod="openshift-image-registry/node-ca-r5z5s"
Apr 24 21:17:25.476616 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.476305 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/849dc6e4-06dd-4834-a5eb-de6ceebd649f-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-kqxwj\" (UID: \"849dc6e4-06dd-4834-a5eb-de6ceebd649f\") " pod="openshift-multus/multus-additional-cni-plugins-kqxwj"
Apr 24 21:17:25.476616 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.476344 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/d0fb6fb7-0535-4776-8fae-ef83f9bfdcca-host-var-lib-cni-multus\") pod \"multus-df2mh\" (UID: \"d0fb6fb7-0535-4776-8fae-ef83f9bfdcca\") " pod="openshift-multus/multus-df2mh"
Apr 24 21:17:25.476616 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.476366 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3ef140eb-72c8-462e-9469-9aa900c0be05-kubelet-dir\") pod \"aws-ebs-csi-driver-node-2t4hs\" (UID: \"3ef140eb-72c8-462e-9469-9aa900c0be05\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2t4hs"
Apr 24 21:17:25.476616 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.476392 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a5ac4a85-e0fb-4193-bc28-23442097690b-sys\") pod \"tuned-97kf5\" (UID: \"a5ac4a85-e0fb-4193-bc28-23442097690b\") " pod="openshift-cluster-node-tuning-operator/tuned-97kf5"
Apr 24 21:17:25.476616 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.476414 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lj6ww\" (UniqueName: \"kubernetes.io/projected/a5ac4a85-e0fb-4193-bc28-23442097690b-kube-api-access-lj6ww\") pod \"tuned-97kf5\" (UID: \"a5ac4a85-e0fb-4193-bc28-23442097690b\") " pod="openshift-cluster-node-tuning-operator/tuned-97kf5"
Apr 24 21:17:25.476616 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.476436 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/260be1ca-939b-4b9e-9f93-078d2506aef0-host-slash\") pod \"ovnkube-node-6rbsp\" (UID: \"260be1ca-939b-4b9e-9f93-078d2506aef0\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rbsp"
Apr 24 21:17:25.476616 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.476459 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/260be1ca-939b-4b9e-9f93-078d2506aef0-host-run-ovn-kubernetes\") pod \"ovnkube-node-6rbsp\" (UID: \"260be1ca-939b-4b9e-9f93-078d2506aef0\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rbsp"
Apr 24 21:17:25.476616 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.476487 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ldvv\" (UniqueName: \"kubernetes.io/projected/68f2b2b0-7441-4845-8db7-2d2bdb770218-kube-api-access-6ldvv\") pod \"network-check-target-qbhf5\" (UID: \"68f2b2b0-7441-4845-8db7-2d2bdb770218\") " pod="openshift-network-diagnostics/network-check-target-qbhf5"
Apr 24 21:17:25.476616 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.476524 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/260be1ca-939b-4b9e-9f93-078d2506aef0-host-cni-netd\") pod \"ovnkube-node-6rbsp\" (UID: \"260be1ca-939b-4b9e-9f93-078d2506aef0\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rbsp"
Apr 24 21:17:25.476616 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.476566 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/ce514c8a-dcef-4be4-9c59-0a9305bef822-serviceca\") pod \"node-ca-r5z5s\" (UID: \"ce514c8a-dcef-4be4-9c59-0a9305bef822\") " pod="openshift-image-registry/node-ca-r5z5s"
Apr 24 21:17:25.476616 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.476597 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b4deb568-1692-4033-a2a6-45866b8c89db-tmp-dir\") pod \"node-resolver-x6t58\" (UID: \"b4deb568-1692-4033-a2a6-45866b8c89db\") " pod="openshift-dns/node-resolver-x6t58"
Apr 24 21:17:25.476616 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.476613 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/d0fb6fb7-0535-4776-8fae-ef83f9bfdcca-multus-socket-dir-parent\") pod \"multus-df2mh\" (UID: \"d0fb6fb7-0535-4776-8fae-ef83f9bfdcca\") " pod="openshift-multus/multus-df2mh"
Apr 24 21:17:25.477029 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.476633 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tscr5\" (UniqueName: \"kubernetes.io/projected/6159f82c-9148-4579-8103-7b0956bd6ce8-kube-api-access-tscr5\") pod \"iptables-alerter-q6gt8\" (UID: \"6159f82c-9148-4579-8103-7b0956bd6ce8\") " pod="openshift-network-operator/iptables-alerter-q6gt8"
Apr 24 21:17:25.477029 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.476650 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ce514c8a-dcef-4be4-9c59-0a9305bef822-host\") pod \"node-ca-r5z5s\" (UID: \"ce514c8a-dcef-4be4-9c59-0a9305bef822\") " pod="openshift-image-registry/node-ca-r5z5s"
Apr 24 21:17:25.477029 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.476663 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/849dc6e4-06dd-4834-a5eb-de6ceebd649f-tuning-conf-dir\") pod \"multus-additional-cni-plugins-kqxwj\" (UID: \"849dc6e4-06dd-4834-a5eb-de6ceebd649f\") " pod="openshift-multus/multus-additional-cni-plugins-kqxwj"
Apr 24 21:17:25.477029 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.476676 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/d0fb6fb7-0535-4776-8fae-ef83f9bfdcca-host-run-k8s-cni-cncf-io\") pod \"multus-df2mh\" (UID: \"d0fb6fb7-0535-4776-8fae-ef83f9bfdcca\") " pod="openshift-multus/multus-df2mh"
Apr 24 21:17:25.477029 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.476690 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/05f6587b-f5da-428a-8968-80f271212138-agent-certs\") pod \"konnectivity-agent-t6vpk\" (UID: \"05f6587b-f5da-428a-8968-80f271212138\") " pod="kube-system/konnectivity-agent-t6vpk"
Apr 24 21:17:25.477029 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.476704 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/a5ac4a85-e0fb-4193-bc28-23442097690b-etc-systemd\") pod \"tuned-97kf5\" (UID: \"a5ac4a85-e0fb-4193-bc28-23442097690b\") " pod="openshift-cluster-node-tuning-operator/tuned-97kf5"
Apr 24 21:17:25.477029 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.476718 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/260be1ca-939b-4b9e-9f93-078d2506aef0-run-openvswitch\") pod \"ovnkube-node-6rbsp\" (UID: \"260be1ca-939b-4b9e-9f93-078d2506aef0\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rbsp"
Apr 24 21:17:25.477029 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.476731 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/260be1ca-939b-4b9e-9f93-078d2506aef0-run-ovn\") pod \"ovnkube-node-6rbsp\" (UID: \"260be1ca-939b-4b9e-9f93-078d2506aef0\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rbsp"
Apr 24 21:17:25.477029 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.476745 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/849dc6e4-06dd-4834-a5eb-de6ceebd649f-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-kqxwj\" (UID: \"849dc6e4-06dd-4834-a5eb-de6ceebd649f\") " pod="openshift-multus/multus-additional-cni-plugins-kqxwj"
Apr 24 21:17:25.477029 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.476762 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3ef140eb-72c8-462e-9469-9aa900c0be05-socket-dir\") pod \"aws-ebs-csi-driver-node-2t4hs\" (UID: \"3ef140eb-72c8-462e-9469-9aa900c0be05\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2t4hs"
Apr 24 21:17:25.477029 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.476786 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/3ef140eb-72c8-462e-9469-9aa900c0be05-sys-fs\") pod \"aws-ebs-csi-driver-node-2t4hs\" (UID: \"3ef140eb-72c8-462e-9469-9aa900c0be05\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2t4hs"
Apr 24 21:17:25.477029 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.476814 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/6159f82c-9148-4579-8103-7b0956bd6ce8-iptables-alerter-script\") pod \"iptables-alerter-q6gt8\" (UID: \"6159f82c-9148-4579-8103-7b0956bd6ce8\") " pod="openshift-network-operator/iptables-alerter-q6gt8"
Apr 24 21:17:25.477029 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.476837 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a5ac4a85-e0fb-4193-bc28-23442097690b-etc-kubernetes\") pod \"tuned-97kf5\" (UID: \"a5ac4a85-e0fb-4193-bc28-23442097690b\") " pod="openshift-cluster-node-tuning-operator/tuned-97kf5"
Apr 24 21:17:25.477029 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.476851 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/260be1ca-939b-4b9e-9f93-078d2506aef0-etc-openvswitch\") pod \"ovnkube-node-6rbsp\" (UID: \"260be1ca-939b-4b9e-9f93-078d2506aef0\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rbsp"
Apr 24 21:17:25.477029 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.476865 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqm2r\" (UniqueName: \"kubernetes.io/projected/b4deb568-1692-4033-a2a6-45866b8c89db-kube-api-access-rqm2r\") pod \"node-resolver-x6t58\" (UID: \"b4deb568-1692-4033-a2a6-45866b8c89db\") " pod="openshift-dns/node-resolver-x6t58"
Apr 24 21:17:25.477029 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.476880 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d0fb6fb7-0535-4776-8fae-ef83f9bfdcca-cni-binary-copy\") pod \"multus-df2mh\" (UID: \"d0fb6fb7-0535-4776-8fae-ef83f9bfdcca\") " pod="openshift-multus/multus-df2mh"
Apr 24 21:17:25.477505 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.476903 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/d0fb6fb7-0535-4776-8fae-ef83f9bfdcca-host-run-multus-certs\") pod \"multus-df2mh\" (UID: \"d0fb6fb7-0535-4776-8fae-ef83f9bfdcca\") " pod="openshift-multus/multus-df2mh"
Apr 24 21:17:25.477505 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.476941 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/05f6587b-f5da-428a-8968-80f271212138-konnectivity-ca\") pod \"konnectivity-agent-t6vpk\" (UID: \"05f6587b-f5da-428a-8968-80f271212138\") " pod="kube-system/konnectivity-agent-t6vpk"
Apr 24 21:17:25.477505 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.476965 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/a5ac4a85-e0fb-4193-bc28-23442097690b-etc-sysctl-conf\") pod \"tuned-97kf5\" (UID: \"a5ac4a85-e0fb-4193-bc28-23442097690b\") " pod="openshift-cluster-node-tuning-operator/tuned-97kf5"
Apr 24 21:17:25.477505 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.476985 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a5ac4a85-e0fb-4193-bc28-23442097690b-tmp\") pod \"tuned-97kf5\" (UID: \"a5ac4a85-e0fb-4193-bc28-23442097690b\") " pod="openshift-cluster-node-tuning-operator/tuned-97kf5"
Apr 24 21:17:25.477505 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.477002 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/260be1ca-939b-4b9e-9f93-078d2506aef0-host-kubelet\") pod \"ovnkube-node-6rbsp\" (UID: \"260be1ca-939b-4b9e-9f93-078d2506aef0\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rbsp"
Apr 24 21:17:25.477505 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.477016 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/260be1ca-939b-4b9e-9f93-078d2506aef0-ovnkube-script-lib\") pod \"ovnkube-node-6rbsp\" (UID: \"260be1ca-939b-4b9e-9f93-078d2506aef0\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rbsp"
Apr 24 21:17:25.477505 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.477029 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d0fb6fb7-0535-4776-8fae-ef83f9bfdcca-os-release\") pod \"multus-df2mh\" (UID: \"d0fb6fb7-0535-4776-8fae-ef83f9bfdcca\") " pod="openshift-multus/multus-df2mh"
Apr 24 21:17:25.477505 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.477047 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/a5ac4a85-e0fb-4193-bc28-23442097690b-etc-sysconfig\") pod \"tuned-97kf5\" (UID: \"a5ac4a85-e0fb-4193-bc28-23442097690b\") " pod="openshift-cluster-node-tuning-operator/tuned-97kf5"
Apr 24 21:17:25.477505 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.477064 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/260be1ca-939b-4b9e-9f93-078d2506aef0-systemd-units\") pod \"ovnkube-node-6rbsp\" (UID: \"260be1ca-939b-4b9e-9f93-078d2506aef0\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rbsp"
Apr 24 21:17:25.477505 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.477079 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/260be1ca-939b-4b9e-9f93-078d2506aef0-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-6rbsp\" (UID: \"260be1ca-939b-4b9e-9f93-078d2506aef0\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rbsp"
Apr 24 21:17:25.477505 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.477094 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfxxs\" (UniqueName: \"kubernetes.io/projected/849dc6e4-06dd-4834-a5eb-de6ceebd649f-kube-api-access-sfxxs\") pod \"multus-additional-cni-plugins-kqxwj\" (UID: \"849dc6e4-06dd-4834-a5eb-de6ceebd649f\") " pod="openshift-multus/multus-additional-cni-plugins-kqxwj"
Apr 24 21:17:25.477505 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.477112 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/23cacaf9-66cf-483e-89f1-70f1b4c3cc3c-metrics-certs\") pod \"network-metrics-daemon-ldwdd\" (UID: \"23cacaf9-66cf-483e-89f1-70f1b4c3cc3c\") " pod="openshift-multus/network-metrics-daemon-ldwdd"
Apr 24 21:17:25.477505 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.477136 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jl7st\" (UniqueName: \"kubernetes.io/projected/23cacaf9-66cf-483e-89f1-70f1b4c3cc3c-kube-api-access-jl7st\") pod \"network-metrics-daemon-ldwdd\" (UID: \"23cacaf9-66cf-483e-89f1-70f1b4c3cc3c\") " pod="openshift-multus/network-metrics-daemon-ldwdd"
Apr 24 21:17:25.477505 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.477156 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/260be1ca-939b-4b9e-9f93-078d2506aef0-var-lib-openvswitch\") pod \"ovnkube-node-6rbsp\" (UID: \"260be1ca-939b-4b9e-9f93-078d2506aef0\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rbsp"
Apr 24 21:17:25.477505 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.477176 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d0fb6fb7-0535-4776-8fae-ef83f9bfdcca-multus-cni-dir\") pod \"multus-df2mh\" (UID: \"d0fb6fb7-0535-4776-8fae-ef83f9bfdcca\") " pod="openshift-multus/multus-df2mh"
Apr 24 21:17:25.477505 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.477195 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d0fb6fb7-0535-4776-8fae-ef83f9bfdcca-multus-conf-dir\") pod \"multus-df2mh\" (UID: \"d0fb6fb7-0535-4776-8fae-ef83f9bfdcca\") " pod="openshift-multus/multus-df2mh"
Apr 24 21:17:25.478000 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.477217 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/d0fb6fb7-0535-4776-8fae-ef83f9bfdcca-multus-daemon-config\") pod \"multus-df2mh\" (UID: \"d0fb6fb7-0535-4776-8fae-ef83f9bfdcca\") " pod="openshift-multus/multus-df2mh"
Apr 24 21:17:25.478000 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.477236 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6159f82c-9148-4579-8103-7b0956bd6ce8-host-slash\") pod \"iptables-alerter-q6gt8\" (UID: \"6159f82c-9148-4579-8103-7b0956bd6ce8\") " pod="openshift-network-operator/iptables-alerter-q6gt8"
Apr 24 21:17:25.478000 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.477251 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a5ac4a85-e0fb-4193-bc28-23442097690b-run\") pod \"tuned-97kf5\" (UID: \"a5ac4a85-e0fb-4193-bc28-23442097690b\") " pod="openshift-cluster-node-tuning-operator/tuned-97kf5"
Apr 24 21:17:25.478000 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.477267 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/849dc6e4-06dd-4834-a5eb-de6ceebd649f-os-release\") pod \"multus-additional-cni-plugins-kqxwj\" (UID: \"849dc6e4-06dd-4834-a5eb-de6ceebd649f\") " pod="openshift-multus/multus-additional-cni-plugins-kqxwj"
Apr 24 21:17:25.478000 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.477281 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/d0fb6fb7-0535-4776-8fae-ef83f9bfdcca-hostroot\") pod \"multus-df2mh\" (UID: \"d0fb6fb7-0535-4776-8fae-ef83f9bfdcca\") " pod="openshift-multus/multus-df2mh"
Apr 24 21:17:25.478000 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.477301 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6g7fx\" (UniqueName: \"kubernetes.io/projected/d0fb6fb7-0535-4776-8fae-ef83f9bfdcca-kube-api-access-6g7fx\") pod \"multus-df2mh\" (UID: \"d0fb6fb7-0535-4776-8fae-ef83f9bfdcca\") " pod="openshift-multus/multus-df2mh"
Apr 24 21:17:25.478000 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.477316 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3ef140eb-72c8-462e-9469-9aa900c0be05-registration-dir\") pod \"aws-ebs-csi-driver-node-2t4hs\" (UID: \"3ef140eb-72c8-462e-9469-9aa900c0be05\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2t4hs"
Apr 24 21:17:25.478000 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.477330 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jpt6l\" (UniqueName: \"kubernetes.io/projected/3ef140eb-72c8-462e-9469-9aa900c0be05-kube-api-access-jpt6l\") pod \"aws-ebs-csi-driver-node-2t4hs\" (UID: \"3ef140eb-72c8-462e-9469-9aa900c0be05\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2t4hs"
Apr 24 21:17:25.478000 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.477343 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/a5ac4a85-e0fb-4193-bc28-23442097690b-etc-tuned\") pod \"tuned-97kf5\" (UID: \"a5ac4a85-e0fb-4193-bc28-23442097690b\") " pod="openshift-cluster-node-tuning-operator/tuned-97kf5"
Apr 24 21:17:25.478000 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.477364 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/260be1ca-939b-4b9e-9f93-078d2506aef0-env-overrides\") pod \"ovnkube-node-6rbsp\" (UID: \"260be1ca-939b-4b9e-9f93-078d2506aef0\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rbsp"
Apr 24 21:17:25.478000 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.477386 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d0fb6fb7-0535-4776-8fae-ef83f9bfdcca-system-cni-dir\") pod \"multus-df2mh\" (UID: \"d0fb6fb7-0535-4776-8fae-ef83f9bfdcca\") " pod="openshift-multus/multus-df2mh"
Apr 24 21:17:25.478000 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.477410 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d0fb6fb7-0535-4776-8fae-ef83f9bfdcca-host-run-netns\") pod \"multus-df2mh\" (UID: \"d0fb6fb7-0535-4776-8fae-ef83f9bfdcca\") " pod="openshift-multus/multus-df2mh"
Apr 24 21:17:25.478000 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.477432 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/260be1ca-939b-4b9e-9f93-078d2506aef0-ovnkube-config\") pod \"ovnkube-node-6rbsp\" (UID: \"260be1ca-939b-4b9e-9f93-078d2506aef0\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rbsp"
Apr 24 21:17:25.478000 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.477455 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/260be1ca-939b-4b9e-9f93-078d2506aef0-ovn-node-metrics-cert\") pod \"ovnkube-node-6rbsp\" (UID: \"260be1ca-939b-4b9e-9f93-078d2506aef0\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rbsp"
Apr 24 21:17:25.478000 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.477475 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/849dc6e4-06dd-4834-a5eb-de6ceebd649f-system-cni-dir\") pod \"multus-additional-cni-plugins-kqxwj\" (UID: \"849dc6e4-06dd-4834-a5eb-de6ceebd649f\") " pod="openshift-multus/multus-additional-cni-plugins-kqxwj"
Apr 24 21:17:25.478000 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.477490 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/849dc6e4-06dd-4834-a5eb-de6ceebd649f-cni-binary-copy\") pod \"multus-additional-cni-plugins-kqxwj\" (UID: \"849dc6e4-06dd-4834-a5eb-de6ceebd649f\") " pod="openshift-multus/multus-additional-cni-plugins-kqxwj"
Apr 24 21:17:25.478427 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.477504 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d0fb6fb7-0535-4776-8fae-ef83f9bfdcca-etc-kubernetes\") pod \"multus-df2mh\" (UID: \"d0fb6fb7-0535-4776-8fae-ef83f9bfdcca\") " pod="openshift-multus/multus-df2mh"
Apr 24 21:17:25.478427 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.477517 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/3ef140eb-72c8-462e-9469-9aa900c0be05-etc-selinux\") pod \"aws-ebs-csi-driver-node-2t4hs\" (UID: \"3ef140eb-72c8-462e-9469-9aa900c0be05\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2t4hs"
Apr 24 21:17:25.578592 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.578545 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lj6ww\" (UniqueName: \"kubernetes.io/projected/a5ac4a85-e0fb-4193-bc28-23442097690b-kube-api-access-lj6ww\") pod \"tuned-97kf5\" (UID: \"a5ac4a85-e0fb-4193-bc28-23442097690b\") " pod="openshift-cluster-node-tuning-operator/tuned-97kf5"
Apr 24 21:17:25.578592 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.578582 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/260be1ca-939b-4b9e-9f93-078d2506aef0-host-slash\") pod \"ovnkube-node-6rbsp\" (UID: \"260be1ca-939b-4b9e-9f93-078d2506aef0\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rbsp"
Apr 24 21:17:25.578724 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.578602 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/260be1ca-939b-4b9e-9f93-078d2506aef0-host-run-ovn-kubernetes\") pod \"ovnkube-node-6rbsp\" (UID: \"260be1ca-939b-4b9e-9f93-078d2506aef0\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rbsp"
Apr 24 21:17:25.578724 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.578618 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6ldvv\" (UniqueName: \"kubernetes.io/projected/68f2b2b0-7441-4845-8db7-2d2bdb770218-kube-api-access-6ldvv\") pod \"network-check-target-qbhf5\" (UID: \"68f2b2b0-7441-4845-8db7-2d2bdb770218\") " pod="openshift-network-diagnostics/network-check-target-qbhf5"
Apr 24 21:17:25.578724 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.578633 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/260be1ca-939b-4b9e-9f93-078d2506aef0-host-cni-netd\") pod \"ovnkube-node-6rbsp\" (UID: \"260be1ca-939b-4b9e-9f93-078d2506aef0\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rbsp"
Apr 24 21:17:25.578724 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.578654 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/ce514c8a-dcef-4be4-9c59-0a9305bef822-serviceca\") pod \"node-ca-r5z5s\"
(UID: \"ce514c8a-dcef-4be4-9c59-0a9305bef822\") " pod="openshift-image-registry/node-ca-r5z5s" Apr 24 21:17:25.578724 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.578676 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b4deb568-1692-4033-a2a6-45866b8c89db-tmp-dir\") pod \"node-resolver-x6t58\" (UID: \"b4deb568-1692-4033-a2a6-45866b8c89db\") " pod="openshift-dns/node-resolver-x6t58" Apr 24 21:17:25.578724 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.578685 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/260be1ca-939b-4b9e-9f93-078d2506aef0-host-run-ovn-kubernetes\") pod \"ovnkube-node-6rbsp\" (UID: \"260be1ca-939b-4b9e-9f93-078d2506aef0\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rbsp" Apr 24 21:17:25.578724 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.578695 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/260be1ca-939b-4b9e-9f93-078d2506aef0-host-slash\") pod \"ovnkube-node-6rbsp\" (UID: \"260be1ca-939b-4b9e-9f93-078d2506aef0\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rbsp" Apr 24 21:17:25.578724 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.578700 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/d0fb6fb7-0535-4776-8fae-ef83f9bfdcca-multus-socket-dir-parent\") pod \"multus-df2mh\" (UID: \"d0fb6fb7-0535-4776-8fae-ef83f9bfdcca\") " pod="openshift-multus/multus-df2mh" Apr 24 21:17:25.579080 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.578710 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/260be1ca-939b-4b9e-9f93-078d2506aef0-host-cni-netd\") pod \"ovnkube-node-6rbsp\" (UID: 
\"260be1ca-939b-4b9e-9f93-078d2506aef0\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rbsp" Apr 24 21:17:25.579080 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.578770 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/d0fb6fb7-0535-4776-8fae-ef83f9bfdcca-multus-socket-dir-parent\") pod \"multus-df2mh\" (UID: \"d0fb6fb7-0535-4776-8fae-ef83f9bfdcca\") " pod="openshift-multus/multus-df2mh" Apr 24 21:17:25.579080 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.578853 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tscr5\" (UniqueName: \"kubernetes.io/projected/6159f82c-9148-4579-8103-7b0956bd6ce8-kube-api-access-tscr5\") pod \"iptables-alerter-q6gt8\" (UID: \"6159f82c-9148-4579-8103-7b0956bd6ce8\") " pod="openshift-network-operator/iptables-alerter-q6gt8" Apr 24 21:17:25.579080 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.578891 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ce514c8a-dcef-4be4-9c59-0a9305bef822-host\") pod \"node-ca-r5z5s\" (UID: \"ce514c8a-dcef-4be4-9c59-0a9305bef822\") " pod="openshift-image-registry/node-ca-r5z5s" Apr 24 21:17:25.579080 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.578917 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/849dc6e4-06dd-4834-a5eb-de6ceebd649f-tuning-conf-dir\") pod \"multus-additional-cni-plugins-kqxwj\" (UID: \"849dc6e4-06dd-4834-a5eb-de6ceebd649f\") " pod="openshift-multus/multus-additional-cni-plugins-kqxwj" Apr 24 21:17:25.579080 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.578959 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: 
\"kubernetes.io/host-path/d0fb6fb7-0535-4776-8fae-ef83f9bfdcca-host-run-k8s-cni-cncf-io\") pod \"multus-df2mh\" (UID: \"d0fb6fb7-0535-4776-8fae-ef83f9bfdcca\") " pod="openshift-multus/multus-df2mh" Apr 24 21:17:25.579080 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.578978 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ce514c8a-dcef-4be4-9c59-0a9305bef822-host\") pod \"node-ca-r5z5s\" (UID: \"ce514c8a-dcef-4be4-9c59-0a9305bef822\") " pod="openshift-image-registry/node-ca-r5z5s" Apr 24 21:17:25.579080 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.578986 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/05f6587b-f5da-428a-8968-80f271212138-agent-certs\") pod \"konnectivity-agent-t6vpk\" (UID: \"05f6587b-f5da-428a-8968-80f271212138\") " pod="kube-system/konnectivity-agent-t6vpk" Apr 24 21:17:25.579080 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.578957 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b4deb568-1692-4033-a2a6-45866b8c89db-tmp-dir\") pod \"node-resolver-x6t58\" (UID: \"b4deb568-1692-4033-a2a6-45866b8c89db\") " pod="openshift-dns/node-resolver-x6t58" Apr 24 21:17:25.579080 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.579023 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/a5ac4a85-e0fb-4193-bc28-23442097690b-etc-systemd\") pod \"tuned-97kf5\" (UID: \"a5ac4a85-e0fb-4193-bc28-23442097690b\") " pod="openshift-cluster-node-tuning-operator/tuned-97kf5" Apr 24 21:17:25.579080 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.579044 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: 
\"kubernetes.io/host-path/d0fb6fb7-0535-4776-8fae-ef83f9bfdcca-host-run-k8s-cni-cncf-io\") pod \"multus-df2mh\" (UID: \"d0fb6fb7-0535-4776-8fae-ef83f9bfdcca\") " pod="openshift-multus/multus-df2mh" Apr 24 21:17:25.579080 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.579049 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/260be1ca-939b-4b9e-9f93-078d2506aef0-run-openvswitch\") pod \"ovnkube-node-6rbsp\" (UID: \"260be1ca-939b-4b9e-9f93-078d2506aef0\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rbsp" Apr 24 21:17:25.579080 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.579076 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/260be1ca-939b-4b9e-9f93-078d2506aef0-run-ovn\") pod \"ovnkube-node-6rbsp\" (UID: \"260be1ca-939b-4b9e-9f93-078d2506aef0\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rbsp" Apr 24 21:17:25.579080 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.579079 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/849dc6e4-06dd-4834-a5eb-de6ceebd649f-tuning-conf-dir\") pod \"multus-additional-cni-plugins-kqxwj\" (UID: \"849dc6e4-06dd-4834-a5eb-de6ceebd649f\") " pod="openshift-multus/multus-additional-cni-plugins-kqxwj" Apr 24 21:17:25.579080 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.579085 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/a5ac4a85-e0fb-4193-bc28-23442097690b-etc-systemd\") pod \"tuned-97kf5\" (UID: \"a5ac4a85-e0fb-4193-bc28-23442097690b\") " pod="openshift-cluster-node-tuning-operator/tuned-97kf5" Apr 24 21:17:25.579767 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.579101 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/849dc6e4-06dd-4834-a5eb-de6ceebd649f-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-kqxwj\" (UID: \"849dc6e4-06dd-4834-a5eb-de6ceebd649f\") " pod="openshift-multus/multus-additional-cni-plugins-kqxwj" Apr 24 21:17:25.579767 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.579125 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/260be1ca-939b-4b9e-9f93-078d2506aef0-run-ovn\") pod \"ovnkube-node-6rbsp\" (UID: \"260be1ca-939b-4b9e-9f93-078d2506aef0\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rbsp" Apr 24 21:17:25.579767 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.579113 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/260be1ca-939b-4b9e-9f93-078d2506aef0-run-openvswitch\") pod \"ovnkube-node-6rbsp\" (UID: \"260be1ca-939b-4b9e-9f93-078d2506aef0\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rbsp" Apr 24 21:17:25.579767 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.579138 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3ef140eb-72c8-462e-9469-9aa900c0be05-socket-dir\") pod \"aws-ebs-csi-driver-node-2t4hs\" (UID: \"3ef140eb-72c8-462e-9469-9aa900c0be05\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2t4hs" Apr 24 21:17:25.579767 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.579168 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/3ef140eb-72c8-462e-9469-9aa900c0be05-sys-fs\") pod \"aws-ebs-csi-driver-node-2t4hs\" (UID: \"3ef140eb-72c8-462e-9469-9aa900c0be05\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2t4hs" Apr 24 21:17:25.579767 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.579190 2560 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/6159f82c-9148-4579-8103-7b0956bd6ce8-iptables-alerter-script\") pod \"iptables-alerter-q6gt8\" (UID: \"6159f82c-9148-4579-8103-7b0956bd6ce8\") " pod="openshift-network-operator/iptables-alerter-q6gt8" Apr 24 21:17:25.579767 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.579211 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a5ac4a85-e0fb-4193-bc28-23442097690b-etc-kubernetes\") pod \"tuned-97kf5\" (UID: \"a5ac4a85-e0fb-4193-bc28-23442097690b\") " pod="openshift-cluster-node-tuning-operator/tuned-97kf5" Apr 24 21:17:25.579767 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.579231 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/260be1ca-939b-4b9e-9f93-078d2506aef0-etc-openvswitch\") pod \"ovnkube-node-6rbsp\" (UID: \"260be1ca-939b-4b9e-9f93-078d2506aef0\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rbsp" Apr 24 21:17:25.579767 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.579250 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rqm2r\" (UniqueName: \"kubernetes.io/projected/b4deb568-1692-4033-a2a6-45866b8c89db-kube-api-access-rqm2r\") pod \"node-resolver-x6t58\" (UID: \"b4deb568-1692-4033-a2a6-45866b8c89db\") " pod="openshift-dns/node-resolver-x6t58" Apr 24 21:17:25.579767 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.579269 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d0fb6fb7-0535-4776-8fae-ef83f9bfdcca-cni-binary-copy\") pod \"multus-df2mh\" (UID: \"d0fb6fb7-0535-4776-8fae-ef83f9bfdcca\") " pod="openshift-multus/multus-df2mh" Apr 24 21:17:25.579767 ip-10-0-128-142 
kubenswrapper[2560]: I0424 21:17:25.579275 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/3ef140eb-72c8-462e-9469-9aa900c0be05-sys-fs\") pod \"aws-ebs-csi-driver-node-2t4hs\" (UID: \"3ef140eb-72c8-462e-9469-9aa900c0be05\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2t4hs" Apr 24 21:17:25.579767 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.579290 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/d0fb6fb7-0535-4776-8fae-ef83f9bfdcca-host-run-multus-certs\") pod \"multus-df2mh\" (UID: \"d0fb6fb7-0535-4776-8fae-ef83f9bfdcca\") " pod="openshift-multus/multus-df2mh" Apr 24 21:17:25.579767 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.579310 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/05f6587b-f5da-428a-8968-80f271212138-konnectivity-ca\") pod \"konnectivity-agent-t6vpk\" (UID: \"05f6587b-f5da-428a-8968-80f271212138\") " pod="kube-system/konnectivity-agent-t6vpk" Apr 24 21:17:25.579767 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.579333 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/a5ac4a85-e0fb-4193-bc28-23442097690b-etc-sysctl-conf\") pod \"tuned-97kf5\" (UID: \"a5ac4a85-e0fb-4193-bc28-23442097690b\") " pod="openshift-cluster-node-tuning-operator/tuned-97kf5" Apr 24 21:17:25.579767 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.579353 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a5ac4a85-e0fb-4193-bc28-23442097690b-tmp\") pod \"tuned-97kf5\" (UID: \"a5ac4a85-e0fb-4193-bc28-23442097690b\") " pod="openshift-cluster-node-tuning-operator/tuned-97kf5" Apr 24 21:17:25.579767 ip-10-0-128-142 
kubenswrapper[2560]: I0424 21:17:25.579362 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/d0fb6fb7-0535-4776-8fae-ef83f9bfdcca-host-run-multus-certs\") pod \"multus-df2mh\" (UID: \"d0fb6fb7-0535-4776-8fae-ef83f9bfdcca\") " pod="openshift-multus/multus-df2mh" Apr 24 21:17:25.579767 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.579375 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/260be1ca-939b-4b9e-9f93-078d2506aef0-host-kubelet\") pod \"ovnkube-node-6rbsp\" (UID: \"260be1ca-939b-4b9e-9f93-078d2506aef0\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rbsp" Apr 24 21:17:25.579767 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.579389 2560 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 24 21:17:25.580601 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.579399 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/260be1ca-939b-4b9e-9f93-078d2506aef0-ovnkube-script-lib\") pod \"ovnkube-node-6rbsp\" (UID: \"260be1ca-939b-4b9e-9f93-078d2506aef0\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rbsp" Apr 24 21:17:25.580601 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.579424 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d0fb6fb7-0535-4776-8fae-ef83f9bfdcca-os-release\") pod \"multus-df2mh\" (UID: \"d0fb6fb7-0535-4776-8fae-ef83f9bfdcca\") " pod="openshift-multus/multus-df2mh" Apr 24 21:17:25.580601 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.579446 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/a5ac4a85-e0fb-4193-bc28-23442097690b-etc-sysconfig\") pod \"tuned-97kf5\" (UID: \"a5ac4a85-e0fb-4193-bc28-23442097690b\") " pod="openshift-cluster-node-tuning-operator/tuned-97kf5" Apr 24 21:17:25.580601 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.579475 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/260be1ca-939b-4b9e-9f93-078d2506aef0-systemd-units\") pod \"ovnkube-node-6rbsp\" (UID: \"260be1ca-939b-4b9e-9f93-078d2506aef0\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rbsp" Apr 24 21:17:25.580601 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.579505 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/260be1ca-939b-4b9e-9f93-078d2506aef0-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-6rbsp\" (UID: \"260be1ca-939b-4b9e-9f93-078d2506aef0\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rbsp" Apr 24 21:17:25.580601 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.579512 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/a5ac4a85-e0fb-4193-bc28-23442097690b-etc-sysctl-conf\") pod \"tuned-97kf5\" (UID: \"a5ac4a85-e0fb-4193-bc28-23442097690b\") " pod="openshift-cluster-node-tuning-operator/tuned-97kf5" Apr 24 21:17:25.580601 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.579535 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sfxxs\" (UniqueName: \"kubernetes.io/projected/849dc6e4-06dd-4834-a5eb-de6ceebd649f-kube-api-access-sfxxs\") pod \"multus-additional-cni-plugins-kqxwj\" (UID: \"849dc6e4-06dd-4834-a5eb-de6ceebd649f\") " pod="openshift-multus/multus-additional-cni-plugins-kqxwj" Apr 24 21:17:25.580601 ip-10-0-128-142 kubenswrapper[2560]: I0424 
21:17:25.579550 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/260be1ca-939b-4b9e-9f93-078d2506aef0-etc-openvswitch\") pod \"ovnkube-node-6rbsp\" (UID: \"260be1ca-939b-4b9e-9f93-078d2506aef0\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rbsp" Apr 24 21:17:25.580601 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.579562 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/23cacaf9-66cf-483e-89f1-70f1b4c3cc3c-metrics-certs\") pod \"network-metrics-daemon-ldwdd\" (UID: \"23cacaf9-66cf-483e-89f1-70f1b4c3cc3c\") " pod="openshift-multus/network-metrics-daemon-ldwdd" Apr 24 21:17:25.580601 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.579586 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jl7st\" (UniqueName: \"kubernetes.io/projected/23cacaf9-66cf-483e-89f1-70f1b4c3cc3c-kube-api-access-jl7st\") pod \"network-metrics-daemon-ldwdd\" (UID: \"23cacaf9-66cf-483e-89f1-70f1b4c3cc3c\") " pod="openshift-multus/network-metrics-daemon-ldwdd" Apr 24 21:17:25.580601 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.579592 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/849dc6e4-06dd-4834-a5eb-de6ceebd649f-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-kqxwj\" (UID: \"849dc6e4-06dd-4834-a5eb-de6ceebd649f\") " pod="openshift-multus/multus-additional-cni-plugins-kqxwj" Apr 24 21:17:25.580601 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.579642 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/260be1ca-939b-4b9e-9f93-078d2506aef0-var-lib-openvswitch\") pod \"ovnkube-node-6rbsp\" (UID: \"260be1ca-939b-4b9e-9f93-078d2506aef0\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-6rbsp" Apr 24 21:17:25.580601 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.579689 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/a5ac4a85-e0fb-4193-bc28-23442097690b-etc-sysconfig\") pod \"tuned-97kf5\" (UID: \"a5ac4a85-e0fb-4193-bc28-23442097690b\") " pod="openshift-cluster-node-tuning-operator/tuned-97kf5" Apr 24 21:17:25.580601 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.579714 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/260be1ca-939b-4b9e-9f93-078d2506aef0-var-lib-openvswitch\") pod \"ovnkube-node-6rbsp\" (UID: \"260be1ca-939b-4b9e-9f93-078d2506aef0\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rbsp" Apr 24 21:17:25.580601 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.579718 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d0fb6fb7-0535-4776-8fae-ef83f9bfdcca-multus-cni-dir\") pod \"multus-df2mh\" (UID: \"d0fb6fb7-0535-4776-8fae-ef83f9bfdcca\") " pod="openshift-multus/multus-df2mh" Apr 24 21:17:25.580601 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.579759 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d0fb6fb7-0535-4776-8fae-ef83f9bfdcca-multus-conf-dir\") pod \"multus-df2mh\" (UID: \"d0fb6fb7-0535-4776-8fae-ef83f9bfdcca\") " pod="openshift-multus/multus-df2mh" Apr 24 21:17:25.580601 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.579767 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d0fb6fb7-0535-4776-8fae-ef83f9bfdcca-multus-cni-dir\") pod \"multus-df2mh\" (UID: \"d0fb6fb7-0535-4776-8fae-ef83f9bfdcca\") " pod="openshift-multus/multus-df2mh" Apr 24 
21:17:25.581375 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.579784 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/d0fb6fb7-0535-4776-8fae-ef83f9bfdcca-multus-daemon-config\") pod \"multus-df2mh\" (UID: \"d0fb6fb7-0535-4776-8fae-ef83f9bfdcca\") " pod="openshift-multus/multus-df2mh" Apr 24 21:17:25.581375 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.579826 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6159f82c-9148-4579-8103-7b0956bd6ce8-host-slash\") pod \"iptables-alerter-q6gt8\" (UID: \"6159f82c-9148-4579-8103-7b0956bd6ce8\") " pod="openshift-network-operator/iptables-alerter-q6gt8" Apr 24 21:17:25.581375 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.579846 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a5ac4a85-e0fb-4193-bc28-23442097690b-run\") pod \"tuned-97kf5\" (UID: \"a5ac4a85-e0fb-4193-bc28-23442097690b\") " pod="openshift-cluster-node-tuning-operator/tuned-97kf5" Apr 24 21:17:25.581375 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.579892 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/849dc6e4-06dd-4834-a5eb-de6ceebd649f-os-release\") pod \"multus-additional-cni-plugins-kqxwj\" (UID: \"849dc6e4-06dd-4834-a5eb-de6ceebd649f\") " pod="openshift-multus/multus-additional-cni-plugins-kqxwj" Apr 24 21:17:25.581375 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.579906 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d0fb6fb7-0535-4776-8fae-ef83f9bfdcca-cni-binary-copy\") pod \"multus-df2mh\" (UID: \"d0fb6fb7-0535-4776-8fae-ef83f9bfdcca\") " pod="openshift-multus/multus-df2mh" Apr 24 21:17:25.581375 
ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.579913 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/d0fb6fb7-0535-4776-8fae-ef83f9bfdcca-hostroot\") pod \"multus-df2mh\" (UID: \"d0fb6fb7-0535-4776-8fae-ef83f9bfdcca\") " pod="openshift-multus/multus-df2mh" Apr 24 21:17:25.581375 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.579971 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/d0fb6fb7-0535-4776-8fae-ef83f9bfdcca-hostroot\") pod \"multus-df2mh\" (UID: \"d0fb6fb7-0535-4776-8fae-ef83f9bfdcca\") " pod="openshift-multus/multus-df2mh" Apr 24 21:17:25.581375 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.579971 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6g7fx\" (UniqueName: \"kubernetes.io/projected/d0fb6fb7-0535-4776-8fae-ef83f9bfdcca-kube-api-access-6g7fx\") pod \"multus-df2mh\" (UID: \"d0fb6fb7-0535-4776-8fae-ef83f9bfdcca\") " pod="openshift-multus/multus-df2mh" Apr 24 21:17:25.581375 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.580003 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/6159f82c-9148-4579-8103-7b0956bd6ce8-iptables-alerter-script\") pod \"iptables-alerter-q6gt8\" (UID: \"6159f82c-9148-4579-8103-7b0956bd6ce8\") " pod="openshift-network-operator/iptables-alerter-q6gt8" Apr 24 21:17:25.581375 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.580011 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3ef140eb-72c8-462e-9469-9aa900c0be05-registration-dir\") pod \"aws-ebs-csi-driver-node-2t4hs\" (UID: \"3ef140eb-72c8-462e-9469-9aa900c0be05\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2t4hs" Apr 24 21:17:25.581375 
ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.580035 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jpt6l\" (UniqueName: \"kubernetes.io/projected/3ef140eb-72c8-462e-9469-9aa900c0be05-kube-api-access-jpt6l\") pod \"aws-ebs-csi-driver-node-2t4hs\" (UID: \"3ef140eb-72c8-462e-9469-9aa900c0be05\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2t4hs" Apr 24 21:17:25.581375 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.580049 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/05f6587b-f5da-428a-8968-80f271212138-konnectivity-ca\") pod \"konnectivity-agent-t6vpk\" (UID: \"05f6587b-f5da-428a-8968-80f271212138\") " pod="kube-system/konnectivity-agent-t6vpk" Apr 24 21:17:25.581375 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.580053 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d0fb6fb7-0535-4776-8fae-ef83f9bfdcca-multus-conf-dir\") pod \"multus-df2mh\" (UID: \"d0fb6fb7-0535-4776-8fae-ef83f9bfdcca\") " pod="openshift-multus/multus-df2mh" Apr 24 21:17:25.581375 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.580056 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/a5ac4a85-e0fb-4193-bc28-23442097690b-etc-tuned\") pod \"tuned-97kf5\" (UID: \"a5ac4a85-e0fb-4193-bc28-23442097690b\") " pod="openshift-cluster-node-tuning-operator/tuned-97kf5" Apr 24 21:17:25.581375 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.579308 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a5ac4a85-e0fb-4193-bc28-23442097690b-etc-kubernetes\") pod \"tuned-97kf5\" (UID: \"a5ac4a85-e0fb-4193-bc28-23442097690b\") " pod="openshift-cluster-node-tuning-operator/tuned-97kf5" Apr 24 
21:17:25.581375 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.580092 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/260be1ca-939b-4b9e-9f93-078d2506aef0-env-overrides\") pod \"ovnkube-node-6rbsp\" (UID: \"260be1ca-939b-4b9e-9f93-078d2506aef0\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rbsp" Apr 24 21:17:25.581375 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.580116 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d0fb6fb7-0535-4776-8fae-ef83f9bfdcca-system-cni-dir\") pod \"multus-df2mh\" (UID: \"d0fb6fb7-0535-4776-8fae-ef83f9bfdcca\") " pod="openshift-multus/multus-df2mh" Apr 24 21:17:25.581375 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.580120 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a5ac4a85-e0fb-4193-bc28-23442097690b-run\") pod \"tuned-97kf5\" (UID: \"a5ac4a85-e0fb-4193-bc28-23442097690b\") " pod="openshift-cluster-node-tuning-operator/tuned-97kf5" Apr 24 21:17:25.582144 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.580147 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d0fb6fb7-0535-4776-8fae-ef83f9bfdcca-host-run-netns\") pod \"multus-df2mh\" (UID: \"d0fb6fb7-0535-4776-8fae-ef83f9bfdcca\") " pod="openshift-multus/multus-df2mh" Apr 24 21:17:25.582144 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.580161 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d0fb6fb7-0535-4776-8fae-ef83f9bfdcca-system-cni-dir\") pod \"multus-df2mh\" (UID: \"d0fb6fb7-0535-4776-8fae-ef83f9bfdcca\") " pod="openshift-multus/multus-df2mh" Apr 24 21:17:25.582144 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.580171 2560 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/260be1ca-939b-4b9e-9f93-078d2506aef0-ovnkube-config\") pod \"ovnkube-node-6rbsp\" (UID: \"260be1ca-939b-4b9e-9f93-078d2506aef0\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rbsp" Apr 24 21:17:25.582144 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.580194 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/260be1ca-939b-4b9e-9f93-078d2506aef0-ovn-node-metrics-cert\") pod \"ovnkube-node-6rbsp\" (UID: \"260be1ca-939b-4b9e-9f93-078d2506aef0\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rbsp" Apr 24 21:17:25.582144 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.580218 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/849dc6e4-06dd-4834-a5eb-de6ceebd649f-system-cni-dir\") pod \"multus-additional-cni-plugins-kqxwj\" (UID: \"849dc6e4-06dd-4834-a5eb-de6ceebd649f\") " pod="openshift-multus/multus-additional-cni-plugins-kqxwj" Apr 24 21:17:25.582144 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.580238 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/849dc6e4-06dd-4834-a5eb-de6ceebd649f-cni-binary-copy\") pod \"multus-additional-cni-plugins-kqxwj\" (UID: \"849dc6e4-06dd-4834-a5eb-de6ceebd649f\") " pod="openshift-multus/multus-additional-cni-plugins-kqxwj" Apr 24 21:17:25.582144 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.580258 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d0fb6fb7-0535-4776-8fae-ef83f9bfdcca-etc-kubernetes\") pod \"multus-df2mh\" (UID: \"d0fb6fb7-0535-4776-8fae-ef83f9bfdcca\") " pod="openshift-multus/multus-df2mh" Apr 24 21:17:25.582144 
ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.580277 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/3ef140eb-72c8-462e-9469-9aa900c0be05-etc-selinux\") pod \"aws-ebs-csi-driver-node-2t4hs\" (UID: \"3ef140eb-72c8-462e-9469-9aa900c0be05\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2t4hs" Apr 24 21:17:25.582144 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.580297 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/260be1ca-939b-4b9e-9f93-078d2506aef0-log-socket\") pod \"ovnkube-node-6rbsp\" (UID: \"260be1ca-939b-4b9e-9f93-078d2506aef0\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rbsp" Apr 24 21:17:25.582144 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.580317 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/a5ac4a85-e0fb-4193-bc28-23442097690b-etc-modprobe-d\") pod \"tuned-97kf5\" (UID: \"a5ac4a85-e0fb-4193-bc28-23442097690b\") " pod="openshift-cluster-node-tuning-operator/tuned-97kf5" Apr 24 21:17:25.582144 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.580330 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/260be1ca-939b-4b9e-9f93-078d2506aef0-ovnkube-script-lib\") pod \"ovnkube-node-6rbsp\" (UID: \"260be1ca-939b-4b9e-9f93-078d2506aef0\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rbsp" Apr 24 21:17:25.582144 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.580339 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a5ac4a85-e0fb-4193-bc28-23442097690b-lib-modules\") pod \"tuned-97kf5\" (UID: \"a5ac4a85-e0fb-4193-bc28-23442097690b\") " pod="openshift-cluster-node-tuning-operator/tuned-97kf5" 
Apr 24 21:17:25.582144 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.580342 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/260be1ca-939b-4b9e-9f93-078d2506aef0-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-6rbsp\" (UID: \"260be1ca-939b-4b9e-9f93-078d2506aef0\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rbsp" Apr 24 21:17:25.582144 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.580359 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/260be1ca-939b-4b9e-9f93-078d2506aef0-node-log\") pod \"ovnkube-node-6rbsp\" (UID: \"260be1ca-939b-4b9e-9f93-078d2506aef0\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rbsp" Apr 24 21:17:25.582144 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.579393 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3ef140eb-72c8-462e-9469-9aa900c0be05-socket-dir\") pod \"aws-ebs-csi-driver-node-2t4hs\" (UID: \"3ef140eb-72c8-462e-9469-9aa900c0be05\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2t4hs" Apr 24 21:17:25.582144 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.580398 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/260be1ca-939b-4b9e-9f93-078d2506aef0-node-log\") pod \"ovnkube-node-6rbsp\" (UID: \"260be1ca-939b-4b9e-9f93-078d2506aef0\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rbsp" Apr 24 21:17:25.582144 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.580403 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d0fb6fb7-0535-4776-8fae-ef83f9bfdcca-os-release\") pod \"multus-df2mh\" (UID: \"d0fb6fb7-0535-4776-8fae-ef83f9bfdcca\") " 
pod="openshift-multus/multus-df2mh" Apr 24 21:17:25.582594 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.580405 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6159f82c-9148-4579-8103-7b0956bd6ce8-host-slash\") pod \"iptables-alerter-q6gt8\" (UID: \"6159f82c-9148-4579-8103-7b0956bd6ce8\") " pod="openshift-network-operator/iptables-alerter-q6gt8" Apr 24 21:17:25.582594 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.579809 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/260be1ca-939b-4b9e-9f93-078d2506aef0-host-kubelet\") pod \"ovnkube-node-6rbsp\" (UID: \"260be1ca-939b-4b9e-9f93-078d2506aef0\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rbsp" Apr 24 21:17:25.582594 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.580428 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/260be1ca-939b-4b9e-9f93-078d2506aef0-host-cni-bin\") pod \"ovnkube-node-6rbsp\" (UID: \"260be1ca-939b-4b9e-9f93-078d2506aef0\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rbsp" Apr 24 21:17:25.582594 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.580427 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d0fb6fb7-0535-4776-8fae-ef83f9bfdcca-host-run-netns\") pod \"multus-df2mh\" (UID: \"d0fb6fb7-0535-4776-8fae-ef83f9bfdcca\") " pod="openshift-multus/multus-df2mh" Apr 24 21:17:25.582594 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.580459 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a5ac4a85-e0fb-4193-bc28-23442097690b-host\") pod \"tuned-97kf5\" (UID: \"a5ac4a85-e0fb-4193-bc28-23442097690b\") " pod="openshift-cluster-node-tuning-operator/tuned-97kf5" Apr 24 
21:17:25.582594 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.580484 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b4deb568-1692-4033-a2a6-45866b8c89db-hosts-file\") pod \"node-resolver-x6t58\" (UID: \"b4deb568-1692-4033-a2a6-45866b8c89db\") " pod="openshift-dns/node-resolver-x6t58" Apr 24 21:17:25.582594 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.580487 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3ef140eb-72c8-462e-9469-9aa900c0be05-registration-dir\") pod \"aws-ebs-csi-driver-node-2t4hs\" (UID: \"3ef140eb-72c8-462e-9469-9aa900c0be05\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2t4hs" Apr 24 21:17:25.582594 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.580510 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/849dc6e4-06dd-4834-a5eb-de6ceebd649f-cnibin\") pod \"multus-additional-cni-plugins-kqxwj\" (UID: \"849dc6e4-06dd-4834-a5eb-de6ceebd649f\") " pod="openshift-multus/multus-additional-cni-plugins-kqxwj" Apr 24 21:17:25.582594 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.580537 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d0fb6fb7-0535-4776-8fae-ef83f9bfdcca-cnibin\") pod \"multus-df2mh\" (UID: \"d0fb6fb7-0535-4776-8fae-ef83f9bfdcca\") " pod="openshift-multus/multus-df2mh" Apr 24 21:17:25.582594 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.580561 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d0fb6fb7-0535-4776-8fae-ef83f9bfdcca-host-var-lib-kubelet\") pod \"multus-df2mh\" (UID: \"d0fb6fb7-0535-4776-8fae-ef83f9bfdcca\") " pod="openshift-multus/multus-df2mh" Apr 24 
21:17:25.582594 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.580587 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/3ef140eb-72c8-462e-9469-9aa900c0be05-device-dir\") pod \"aws-ebs-csi-driver-node-2t4hs\" (UID: \"3ef140eb-72c8-462e-9469-9aa900c0be05\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2t4hs" Apr 24 21:17:25.582594 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.580610 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a5ac4a85-e0fb-4193-bc28-23442097690b-var-lib-kubelet\") pod \"tuned-97kf5\" (UID: \"a5ac4a85-e0fb-4193-bc28-23442097690b\") " pod="openshift-cluster-node-tuning-operator/tuned-97kf5" Apr 24 21:17:25.582594 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.580635 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/260be1ca-939b-4b9e-9f93-078d2506aef0-host-run-netns\") pod \"ovnkube-node-6rbsp\" (UID: \"260be1ca-939b-4b9e-9f93-078d2506aef0\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rbsp" Apr 24 21:17:25.582594 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.580660 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/260be1ca-939b-4b9e-9f93-078d2506aef0-run-systemd\") pod \"ovnkube-node-6rbsp\" (UID: \"260be1ca-939b-4b9e-9f93-078d2506aef0\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rbsp" Apr 24 21:17:25.582594 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.580685 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d0fb6fb7-0535-4776-8fae-ef83f9bfdcca-host-var-lib-cni-bin\") pod \"multus-df2mh\" (UID: \"d0fb6fb7-0535-4776-8fae-ef83f9bfdcca\") " 
pod="openshift-multus/multus-df2mh" Apr 24 21:17:25.582594 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.580711 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/a5ac4a85-e0fb-4193-bc28-23442097690b-etc-sysctl-d\") pod \"tuned-97kf5\" (UID: \"a5ac4a85-e0fb-4193-bc28-23442097690b\") " pod="openshift-cluster-node-tuning-operator/tuned-97kf5" Apr 24 21:17:25.582594 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.580716 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/260be1ca-939b-4b9e-9f93-078d2506aef0-env-overrides\") pod \"ovnkube-node-6rbsp\" (UID: \"260be1ca-939b-4b9e-9f93-078d2506aef0\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rbsp" Apr 24 21:17:25.582594 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.580736 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fmrnc\" (UniqueName: \"kubernetes.io/projected/260be1ca-939b-4b9e-9f93-078d2506aef0-kube-api-access-fmrnc\") pod \"ovnkube-node-6rbsp\" (UID: \"260be1ca-939b-4b9e-9f93-078d2506aef0\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rbsp" Apr 24 21:17:25.583102 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.580751 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d0fb6fb7-0535-4776-8fae-ef83f9bfdcca-cnibin\") pod \"multus-df2mh\" (UID: \"d0fb6fb7-0535-4776-8fae-ef83f9bfdcca\") " pod="openshift-multus/multus-df2mh" Apr 24 21:17:25.583102 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.580762 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9fj9n\" (UniqueName: \"kubernetes.io/projected/ce514c8a-dcef-4be4-9c59-0a9305bef822-kube-api-access-9fj9n\") pod \"node-ca-r5z5s\" (UID: \"ce514c8a-dcef-4be4-9c59-0a9305bef822\") " 
pod="openshift-image-registry/node-ca-r5z5s" Apr 24 21:17:25.583102 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.580789 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/849dc6e4-06dd-4834-a5eb-de6ceebd649f-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-kqxwj\" (UID: \"849dc6e4-06dd-4834-a5eb-de6ceebd649f\") " pod="openshift-multus/multus-additional-cni-plugins-kqxwj" Apr 24 21:17:25.583102 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.580816 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/d0fb6fb7-0535-4776-8fae-ef83f9bfdcca-host-var-lib-cni-multus\") pod \"multus-df2mh\" (UID: \"d0fb6fb7-0535-4776-8fae-ef83f9bfdcca\") " pod="openshift-multus/multus-df2mh" Apr 24 21:17:25.583102 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.580825 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a5ac4a85-e0fb-4193-bc28-23442097690b-host\") pod \"tuned-97kf5\" (UID: \"a5ac4a85-e0fb-4193-bc28-23442097690b\") " pod="openshift-cluster-node-tuning-operator/tuned-97kf5" Apr 24 21:17:25.583102 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.580843 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3ef140eb-72c8-462e-9469-9aa900c0be05-kubelet-dir\") pod \"aws-ebs-csi-driver-node-2t4hs\" (UID: \"3ef140eb-72c8-462e-9469-9aa900c0be05\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2t4hs" Apr 24 21:17:25.583102 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.580862 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/849dc6e4-06dd-4834-a5eb-de6ceebd649f-system-cni-dir\") pod \"multus-additional-cni-plugins-kqxwj\" 
(UID: \"849dc6e4-06dd-4834-a5eb-de6ceebd649f\") " pod="openshift-multus/multus-additional-cni-plugins-kqxwj" Apr 24 21:17:25.583102 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.580868 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a5ac4a85-e0fb-4193-bc28-23442097690b-sys\") pod \"tuned-97kf5\" (UID: \"a5ac4a85-e0fb-4193-bc28-23442097690b\") " pod="openshift-cluster-node-tuning-operator/tuned-97kf5" Apr 24 21:17:25.583102 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.580969 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a5ac4a85-e0fb-4193-bc28-23442097690b-sys\") pod \"tuned-97kf5\" (UID: \"a5ac4a85-e0fb-4193-bc28-23442097690b\") " pod="openshift-cluster-node-tuning-operator/tuned-97kf5" Apr 24 21:17:25.583102 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.580538 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/849dc6e4-06dd-4834-a5eb-de6ceebd649f-os-release\") pod \"multus-additional-cni-plugins-kqxwj\" (UID: \"849dc6e4-06dd-4834-a5eb-de6ceebd649f\") " pod="openshift-multus/multus-additional-cni-plugins-kqxwj" Apr 24 21:17:25.583102 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.581018 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d0fb6fb7-0535-4776-8fae-ef83f9bfdcca-host-var-lib-kubelet\") pod \"multus-df2mh\" (UID: \"d0fb6fb7-0535-4776-8fae-ef83f9bfdcca\") " pod="openshift-multus/multus-df2mh" Apr 24 21:17:25.583102 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.581059 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/3ef140eb-72c8-462e-9469-9aa900c0be05-device-dir\") pod \"aws-ebs-csi-driver-node-2t4hs\" (UID: 
\"3ef140eb-72c8-462e-9469-9aa900c0be05\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2t4hs" Apr 24 21:17:25.583102 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.581103 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a5ac4a85-e0fb-4193-bc28-23442097690b-var-lib-kubelet\") pod \"tuned-97kf5\" (UID: \"a5ac4a85-e0fb-4193-bc28-23442097690b\") " pod="openshift-cluster-node-tuning-operator/tuned-97kf5" Apr 24 21:17:25.583102 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.581139 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/260be1ca-939b-4b9e-9f93-078d2506aef0-host-run-netns\") pod \"ovnkube-node-6rbsp\" (UID: \"260be1ca-939b-4b9e-9f93-078d2506aef0\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rbsp" Apr 24 21:17:25.583102 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.581155 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/d0fb6fb7-0535-4776-8fae-ef83f9bfdcca-multus-daemon-config\") pod \"multus-df2mh\" (UID: \"d0fb6fb7-0535-4776-8fae-ef83f9bfdcca\") " pod="openshift-multus/multus-df2mh" Apr 24 21:17:25.583102 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.581174 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/260be1ca-939b-4b9e-9f93-078d2506aef0-run-systemd\") pod \"ovnkube-node-6rbsp\" (UID: \"260be1ca-939b-4b9e-9f93-078d2506aef0\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rbsp" Apr 24 21:17:25.583102 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.581210 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d0fb6fb7-0535-4776-8fae-ef83f9bfdcca-host-var-lib-cni-bin\") pod \"multus-df2mh\" (UID: 
\"d0fb6fb7-0535-4776-8fae-ef83f9bfdcca\") " pod="openshift-multus/multus-df2mh" Apr 24 21:17:25.583102 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.581212 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d0fb6fb7-0535-4776-8fae-ef83f9bfdcca-etc-kubernetes\") pod \"multus-df2mh\" (UID: \"d0fb6fb7-0535-4776-8fae-ef83f9bfdcca\") " pod="openshift-multus/multus-df2mh" Apr 24 21:17:25.583645 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.581304 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/3ef140eb-72c8-462e-9469-9aa900c0be05-etc-selinux\") pod \"aws-ebs-csi-driver-node-2t4hs\" (UID: \"3ef140eb-72c8-462e-9469-9aa900c0be05\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2t4hs" Apr 24 21:17:25.583645 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.581342 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/260be1ca-939b-4b9e-9f93-078d2506aef0-log-socket\") pod \"ovnkube-node-6rbsp\" (UID: \"260be1ca-939b-4b9e-9f93-078d2506aef0\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rbsp" Apr 24 21:17:25.583645 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.580791 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/260be1ca-939b-4b9e-9f93-078d2506aef0-host-cni-bin\") pod \"ovnkube-node-6rbsp\" (UID: \"260be1ca-939b-4b9e-9f93-078d2506aef0\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rbsp" Apr 24 21:17:25.583645 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.581422 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/a5ac4a85-e0fb-4193-bc28-23442097690b-etc-sysctl-d\") pod \"tuned-97kf5\" (UID: \"a5ac4a85-e0fb-4193-bc28-23442097690b\") " 
pod="openshift-cluster-node-tuning-operator/tuned-97kf5" Apr 24 21:17:25.583645 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.581634 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/a5ac4a85-e0fb-4193-bc28-23442097690b-etc-modprobe-d\") pod \"tuned-97kf5\" (UID: \"a5ac4a85-e0fb-4193-bc28-23442097690b\") " pod="openshift-cluster-node-tuning-operator/tuned-97kf5" Apr 24 21:17:25.583645 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.581672 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/260be1ca-939b-4b9e-9f93-078d2506aef0-systemd-units\") pod \"ovnkube-node-6rbsp\" (UID: \"260be1ca-939b-4b9e-9f93-078d2506aef0\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rbsp" Apr 24 21:17:25.583645 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.581704 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/849dc6e4-06dd-4834-a5eb-de6ceebd649f-cni-binary-copy\") pod \"multus-additional-cni-plugins-kqxwj\" (UID: \"849dc6e4-06dd-4834-a5eb-de6ceebd649f\") " pod="openshift-multus/multus-additional-cni-plugins-kqxwj" Apr 24 21:17:25.583645 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:25.581745 2560 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:17:25.583645 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.579394 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/ce514c8a-dcef-4be4-9c59-0a9305bef822-serviceca\") pod \"node-ca-r5z5s\" (UID: \"ce514c8a-dcef-4be4-9c59-0a9305bef822\") " pod="openshift-image-registry/node-ca-r5z5s" Apr 24 21:17:25.583645 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.580824 2560 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/260be1ca-939b-4b9e-9f93-078d2506aef0-ovnkube-config\") pod \"ovnkube-node-6rbsp\" (UID: \"260be1ca-939b-4b9e-9f93-078d2506aef0\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rbsp" Apr 24 21:17:25.583645 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.581803 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a5ac4a85-e0fb-4193-bc28-23442097690b-lib-modules\") pod \"tuned-97kf5\" (UID: \"a5ac4a85-e0fb-4193-bc28-23442097690b\") " pod="openshift-cluster-node-tuning-operator/tuned-97kf5" Apr 24 21:17:25.583645 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.581849 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/849dc6e4-06dd-4834-a5eb-de6ceebd649f-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-kqxwj\" (UID: \"849dc6e4-06dd-4834-a5eb-de6ceebd649f\") " pod="openshift-multus/multus-additional-cni-plugins-kqxwj" Apr 24 21:17:25.583645 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.581870 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3ef140eb-72c8-462e-9469-9aa900c0be05-kubelet-dir\") pod \"aws-ebs-csi-driver-node-2t4hs\" (UID: \"3ef140eb-72c8-462e-9469-9aa900c0be05\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2t4hs" Apr 24 21:17:25.583645 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:25.581885 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/23cacaf9-66cf-483e-89f1-70f1b4c3cc3c-metrics-certs podName:23cacaf9-66cf-483e-89f1-70f1b4c3cc3c nodeName:}" failed. No retries permitted until 2026-04-24 21:17:26.081854609 +0000 UTC m=+34.140642261 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/23cacaf9-66cf-483e-89f1-70f1b4c3cc3c-metrics-certs") pod "network-metrics-daemon-ldwdd" (UID: "23cacaf9-66cf-483e-89f1-70f1b4c3cc3c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:17:25.583645 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.581912 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/849dc6e4-06dd-4834-a5eb-de6ceebd649f-cnibin\") pod \"multus-additional-cni-plugins-kqxwj\" (UID: \"849dc6e4-06dd-4834-a5eb-de6ceebd649f\") " pod="openshift-multus/multus-additional-cni-plugins-kqxwj" Apr 24 21:17:25.583645 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.581917 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/d0fb6fb7-0535-4776-8fae-ef83f9bfdcca-host-var-lib-cni-multus\") pod \"multus-df2mh\" (UID: \"d0fb6fb7-0535-4776-8fae-ef83f9bfdcca\") " pod="openshift-multus/multus-df2mh" Apr 24 21:17:25.583645 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.581972 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b4deb568-1692-4033-a2a6-45866b8c89db-hosts-file\") pod \"node-resolver-x6t58\" (UID: \"b4deb568-1692-4033-a2a6-45866b8c89db\") " pod="openshift-dns/node-resolver-x6t58" Apr 24 21:17:25.584115 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.583632 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/a5ac4a85-e0fb-4193-bc28-23442097690b-etc-tuned\") pod \"tuned-97kf5\" (UID: \"a5ac4a85-e0fb-4193-bc28-23442097690b\") " pod="openshift-cluster-node-tuning-operator/tuned-97kf5" Apr 24 21:17:25.584115 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.583648 2560 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a5ac4a85-e0fb-4193-bc28-23442097690b-tmp\") pod \"tuned-97kf5\" (UID: \"a5ac4a85-e0fb-4193-bc28-23442097690b\") " pod="openshift-cluster-node-tuning-operator/tuned-97kf5" Apr 24 21:17:25.584115 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.583792 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/260be1ca-939b-4b9e-9f93-078d2506aef0-ovn-node-metrics-cert\") pod \"ovnkube-node-6rbsp\" (UID: \"260be1ca-939b-4b9e-9f93-078d2506aef0\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rbsp" Apr 24 21:17:25.584115 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.583804 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/05f6587b-f5da-428a-8968-80f271212138-agent-certs\") pod \"konnectivity-agent-t6vpk\" (UID: \"05f6587b-f5da-428a-8968-80f271212138\") " pod="kube-system/konnectivity-agent-t6vpk" Apr 24 21:17:25.584822 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:25.584803 2560 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 21:17:25.584901 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:25.584826 2560 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 21:17:25.584901 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:25.584839 2560 projected.go:194] Error preparing data for projected volume kube-api-access-6ldvv for pod openshift-network-diagnostics/network-check-target-qbhf5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:17:25.584901 ip-10-0-128-142 kubenswrapper[2560]: E0424 
21:17:25.584897 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/68f2b2b0-7441-4845-8db7-2d2bdb770218-kube-api-access-6ldvv podName:68f2b2b0-7441-4845-8db7-2d2bdb770218 nodeName:}" failed. No retries permitted until 2026-04-24 21:17:26.084875279 +0000 UTC m=+34.143662933 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-6ldvv" (UniqueName: "kubernetes.io/projected/68f2b2b0-7441-4845-8db7-2d2bdb770218-kube-api-access-6ldvv") pod "network-check-target-qbhf5" (UID: "68f2b2b0-7441-4845-8db7-2d2bdb770218") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:17:25.588108 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.587798 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tscr5\" (UniqueName: \"kubernetes.io/projected/6159f82c-9148-4579-8103-7b0956bd6ce8-kube-api-access-tscr5\") pod \"iptables-alerter-q6gt8\" (UID: \"6159f82c-9148-4579-8103-7b0956bd6ce8\") " pod="openshift-network-operator/iptables-alerter-q6gt8"
Apr 24 21:17:25.588108 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.587684 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6g7fx\" (UniqueName: \"kubernetes.io/projected/d0fb6fb7-0535-4776-8fae-ef83f9bfdcca-kube-api-access-6g7fx\") pod \"multus-df2mh\" (UID: \"d0fb6fb7-0535-4776-8fae-ef83f9bfdcca\") " pod="openshift-multus/multus-df2mh"
Apr 24 21:17:25.588805 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.588583 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqm2r\" (UniqueName: \"kubernetes.io/projected/b4deb568-1692-4033-a2a6-45866b8c89db-kube-api-access-rqm2r\") pod \"node-resolver-x6t58\" (UID: \"b4deb568-1692-4033-a2a6-45866b8c89db\") " pod="openshift-dns/node-resolver-x6t58"
Apr 24 21:17:25.588805 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.588758 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jpt6l\" (UniqueName: \"kubernetes.io/projected/3ef140eb-72c8-462e-9469-9aa900c0be05-kube-api-access-jpt6l\") pod \"aws-ebs-csi-driver-node-2t4hs\" (UID: \"3ef140eb-72c8-462e-9469-9aa900c0be05\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2t4hs"
Apr 24 21:17:25.588996 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.588907 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lj6ww\" (UniqueName: \"kubernetes.io/projected/a5ac4a85-e0fb-4193-bc28-23442097690b-kube-api-access-lj6ww\") pod \"tuned-97kf5\" (UID: \"a5ac4a85-e0fb-4193-bc28-23442097690b\") " pod="openshift-cluster-node-tuning-operator/tuned-97kf5"
Apr 24 21:17:25.590541 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.589740 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jl7st\" (UniqueName: \"kubernetes.io/projected/23cacaf9-66cf-483e-89f1-70f1b4c3cc3c-kube-api-access-jl7st\") pod \"network-metrics-daemon-ldwdd\" (UID: \"23cacaf9-66cf-483e-89f1-70f1b4c3cc3c\") " pod="openshift-multus/network-metrics-daemon-ldwdd"
Apr 24 21:17:25.590541 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.590326 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fj9n\" (UniqueName: \"kubernetes.io/projected/ce514c8a-dcef-4be4-9c59-0a9305bef822-kube-api-access-9fj9n\") pod \"node-ca-r5z5s\" (UID: \"ce514c8a-dcef-4be4-9c59-0a9305bef822\") " pod="openshift-image-registry/node-ca-r5z5s"
Apr 24 21:17:25.591436 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.591158 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmrnc\" (UniqueName: \"kubernetes.io/projected/260be1ca-939b-4b9e-9f93-078d2506aef0-kube-api-access-fmrnc\") pod \"ovnkube-node-6rbsp\" (UID: \"260be1ca-939b-4b9e-9f93-078d2506aef0\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rbsp"
Apr 24 21:17:25.592492 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.592473 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfxxs\" (UniqueName: \"kubernetes.io/projected/849dc6e4-06dd-4834-a5eb-de6ceebd649f-kube-api-access-sfxxs\") pod \"multus-additional-cni-plugins-kqxwj\" (UID: \"849dc6e4-06dd-4834-a5eb-de6ceebd649f\") " pod="openshift-multus/multus-additional-cni-plugins-kqxwj"
Apr 24 21:17:25.625159 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.625140 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-97kf5"
Apr 24 21:17:25.631093 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.631074 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-6rbsp"
Apr 24 21:17:25.631433 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:17:25.631413 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda5ac4a85_e0fb_4193_bc28_23442097690b.slice/crio-9739380050b4ec3a8a3491cf199e0a376b4ccc226443da6c99ea5996624d9a09 WatchSource:0}: Error finding container 9739380050b4ec3a8a3491cf199e0a376b4ccc226443da6c99ea5996624d9a09: Status 404 returned error can't find the container with id 9739380050b4ec3a8a3491cf199e0a376b4ccc226443da6c99ea5996624d9a09
Apr 24 21:17:25.636558 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.636536 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2t4hs"
Apr 24 21:17:25.636710 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:17:25.636693 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod260be1ca_939b_4b9e_9f93_078d2506aef0.slice/crio-72166769ed7e488e47e7aabc5859c0a31c88f2cd861c66a55fac9b1d0e414b91 WatchSource:0}: Error finding container 72166769ed7e488e47e7aabc5859c0a31c88f2cd861c66a55fac9b1d0e414b91: Status 404 returned error can't find the container with id 72166769ed7e488e47e7aabc5859c0a31c88f2cd861c66a55fac9b1d0e414b91
Apr 24 21:17:25.643349 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.643332 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-r5z5s"
Apr 24 21:17:25.644613 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:17:25.644560 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3ef140eb_72c8_462e_9469_9aa900c0be05.slice/crio-8feb047800a07d6b3be5fe243f1dda289da072d0beeea860827d6063ea9c7907 WatchSource:0}: Error finding container 8feb047800a07d6b3be5fe243f1dda289da072d0beeea860827d6063ea9c7907: Status 404 returned error can't find the container with id 8feb047800a07d6b3be5fe243f1dda289da072d0beeea860827d6063ea9c7907
Apr 24 21:17:25.648332 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.648316 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-x6t58"
Apr 24 21:17:25.649212 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:17:25.649194 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce514c8a_dcef_4be4_9c59_0a9305bef822.slice/crio-93c99b7d06ec99048104b81d497542f1a1f423d4b94324bd0afc883cd545c3c0 WatchSource:0}: Error finding container 93c99b7d06ec99048104b81d497542f1a1f423d4b94324bd0afc883cd545c3c0: Status 404 returned error can't find the container with id 93c99b7d06ec99048104b81d497542f1a1f423d4b94324bd0afc883cd545c3c0
Apr 24 21:17:25.652910 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.652895 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-kqxwj"
Apr 24 21:17:25.654413 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:17:25.654379 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb4deb568_1692_4033_a2a6_45866b8c89db.slice/crio-3b5cd6570f37113ad568ad9d19e685b7bd4530324d1496247f9f1b0a1543862f WatchSource:0}: Error finding container 3b5cd6570f37113ad568ad9d19e685b7bd4530324d1496247f9f1b0a1543862f: Status 404 returned error can't find the container with id 3b5cd6570f37113ad568ad9d19e685b7bd4530324d1496247f9f1b0a1543862f
Apr 24 21:17:25.658570 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.658550 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-df2mh"
Apr 24 21:17:25.658826 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:17:25.658803 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod849dc6e4_06dd_4834_a5eb_de6ceebd649f.slice/crio-3118473a2451c4d8a3b38fbf1bcb7278f78a172be85746e820601a34b3dca7ff WatchSource:0}: Error finding container 3118473a2451c4d8a3b38fbf1bcb7278f78a172be85746e820601a34b3dca7ff: Status 404 returned error can't find the container with id 3118473a2451c4d8a3b38fbf1bcb7278f78a172be85746e820601a34b3dca7ff
Apr 24 21:17:25.663969 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.663953 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-q6gt8"
Apr 24 21:17:25.664515 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:17:25.664500 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd0fb6fb7_0535_4776_8fae_ef83f9bfdcca.slice/crio-218c8f307d0772395e3f23211151a39a66668c90c7b34bb49ac643b64e23c72f WatchSource:0}: Error finding container 218c8f307d0772395e3f23211151a39a66668c90c7b34bb49ac643b64e23c72f: Status 404 returned error can't find the container with id 218c8f307d0772395e3f23211151a39a66668c90c7b34bb49ac643b64e23c72f
Apr 24 21:17:25.668355 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:25.668212 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-t6vpk"
Apr 24 21:17:25.671000 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:17:25.670979 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6159f82c_9148_4579_8103_7b0956bd6ce8.slice/crio-3eef8e242a17e887a14885bdbc66bf295ec64794a2031a05042216489c0f511a WatchSource:0}: Error finding container 3eef8e242a17e887a14885bdbc66bf295ec64794a2031a05042216489c0f511a: Status 404 returned error can't find the container with id 3eef8e242a17e887a14885bdbc66bf295ec64794a2031a05042216489c0f511a
Apr 24 21:17:25.675189 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:17:25.675076 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod05f6587b_f5da_428a_8968_80f271212138.slice/crio-62bd4f35411b3d65cea5bfbdbebd8182a8dc0d6363f9fbc75cf4e13fb1907af9 WatchSource:0}: Error finding container 62bd4f35411b3d65cea5bfbdbebd8182a8dc0d6363f9fbc75cf4e13fb1907af9: Status 404 returned error can't find the container with id 62bd4f35411b3d65cea5bfbdbebd8182a8dc0d6363f9fbc75cf4e13fb1907af9
Apr 24 21:17:26.083654 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:26.083066 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/23cacaf9-66cf-483e-89f1-70f1b4c3cc3c-metrics-certs\") pod \"network-metrics-daemon-ldwdd\" (UID: \"23cacaf9-66cf-483e-89f1-70f1b4c3cc3c\") " pod="openshift-multus/network-metrics-daemon-ldwdd"
Apr 24 21:17:26.083654 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:26.083206 2560 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:17:26.083654 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:26.083279 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/23cacaf9-66cf-483e-89f1-70f1b4c3cc3c-metrics-certs podName:23cacaf9-66cf-483e-89f1-70f1b4c3cc3c nodeName:}" failed. No retries permitted until 2026-04-24 21:17:27.083260729 +0000 UTC m=+35.142048384 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/23cacaf9-66cf-483e-89f1-70f1b4c3cc3c-metrics-certs") pod "network-metrics-daemon-ldwdd" (UID: "23cacaf9-66cf-483e-89f1-70f1b4c3cc3c") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:17:26.184219 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:26.184184 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6ldvv\" (UniqueName: \"kubernetes.io/projected/68f2b2b0-7441-4845-8db7-2d2bdb770218-kube-api-access-6ldvv\") pod \"network-check-target-qbhf5\" (UID: \"68f2b2b0-7441-4845-8db7-2d2bdb770218\") " pod="openshift-network-diagnostics/network-check-target-qbhf5"
Apr 24 21:17:26.184404 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:26.184388 2560 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 21:17:26.184469 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:26.184427 2560 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 21:17:26.184469 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:26.184443 2560 projected.go:194] Error preparing data for projected volume kube-api-access-6ldvv for pod openshift-network-diagnostics/network-check-target-qbhf5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:17:26.184562 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:26.184501 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/68f2b2b0-7441-4845-8db7-2d2bdb770218-kube-api-access-6ldvv podName:68f2b2b0-7441-4845-8db7-2d2bdb770218 nodeName:}" failed. No retries permitted until 2026-04-24 21:17:27.184481815 +0000 UTC m=+35.243269469 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-6ldvv" (UniqueName: "kubernetes.io/projected/68f2b2b0-7441-4845-8db7-2d2bdb770218-kube-api-access-6ldvv") pod "network-check-target-qbhf5" (UID: "68f2b2b0-7441-4845-8db7-2d2bdb770218") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:17:26.509777 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:26.509733 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-97kf5" event={"ID":"a5ac4a85-e0fb-4193-bc28-23442097690b","Type":"ContainerStarted","Data":"9739380050b4ec3a8a3491cf199e0a376b4ccc226443da6c99ea5996624d9a09"}
Apr 24 21:17:26.511229 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:26.511199 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-q6gt8" event={"ID":"6159f82c-9148-4579-8103-7b0956bd6ce8","Type":"ContainerStarted","Data":"3eef8e242a17e887a14885bdbc66bf295ec64794a2031a05042216489c0f511a"}
Apr 24 21:17:26.513223 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:26.513197 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-df2mh" event={"ID":"d0fb6fb7-0535-4776-8fae-ef83f9bfdcca","Type":"ContainerStarted","Data":"218c8f307d0772395e3f23211151a39a66668c90c7b34bb49ac643b64e23c72f"}
Apr 24 21:17:26.514901 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:26.514877 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kqxwj" event={"ID":"849dc6e4-06dd-4834-a5eb-de6ceebd649f","Type":"ContainerStarted","Data":"3118473a2451c4d8a3b38fbf1bcb7278f78a172be85746e820601a34b3dca7ff"}
Apr 24 21:17:26.516579 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:26.516555 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-r5z5s" event={"ID":"ce514c8a-dcef-4be4-9c59-0a9305bef822","Type":"ContainerStarted","Data":"93c99b7d06ec99048104b81d497542f1a1f423d4b94324bd0afc883cd545c3c0"}
Apr 24 21:17:26.518262 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:26.518241 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2t4hs" event={"ID":"3ef140eb-72c8-462e-9469-9aa900c0be05","Type":"ContainerStarted","Data":"8feb047800a07d6b3be5fe243f1dda289da072d0beeea860827d6063ea9c7907"}
Apr 24 21:17:26.522219 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:26.522196 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6rbsp" event={"ID":"260be1ca-939b-4b9e-9f93-078d2506aef0","Type":"ContainerStarted","Data":"72166769ed7e488e47e7aabc5859c0a31c88f2cd861c66a55fac9b1d0e414b91"}
Apr 24 21:17:26.529380 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:26.527791 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-t6vpk" event={"ID":"05f6587b-f5da-428a-8968-80f271212138","Type":"ContainerStarted","Data":"62bd4f35411b3d65cea5bfbdbebd8182a8dc0d6363f9fbc75cf4e13fb1907af9"}
Apr 24 21:17:26.532978 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:26.532956 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-x6t58" event={"ID":"b4deb568-1692-4033-a2a6-45866b8c89db","Type":"ContainerStarted","Data":"3b5cd6570f37113ad568ad9d19e685b7bd4530324d1496247f9f1b0a1543862f"}
Apr 24 21:17:27.094184 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:27.094098 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/23cacaf9-66cf-483e-89f1-70f1b4c3cc3c-metrics-certs\") pod \"network-metrics-daemon-ldwdd\" (UID: \"23cacaf9-66cf-483e-89f1-70f1b4c3cc3c\") " pod="openshift-multus/network-metrics-daemon-ldwdd"
Apr 24 21:17:27.094344 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:27.094224 2560 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:17:27.094344 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:27.094282 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/23cacaf9-66cf-483e-89f1-70f1b4c3cc3c-metrics-certs podName:23cacaf9-66cf-483e-89f1-70f1b4c3cc3c nodeName:}" failed. No retries permitted until 2026-04-24 21:17:29.094262576 +0000 UTC m=+37.153050231 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/23cacaf9-66cf-483e-89f1-70f1b4c3cc3c-metrics-certs") pod "network-metrics-daemon-ldwdd" (UID: "23cacaf9-66cf-483e-89f1-70f1b4c3cc3c") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:17:27.195415 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:27.195383 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6ldvv\" (UniqueName: \"kubernetes.io/projected/68f2b2b0-7441-4845-8db7-2d2bdb770218-kube-api-access-6ldvv\") pod \"network-check-target-qbhf5\" (UID: \"68f2b2b0-7441-4845-8db7-2d2bdb770218\") " pod="openshift-network-diagnostics/network-check-target-qbhf5"
Apr 24 21:17:27.195574 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:27.195532 2560 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 21:17:27.195574 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:27.195550 2560 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 21:17:27.195574 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:27.195573 2560 projected.go:194] Error preparing data for projected volume kube-api-access-6ldvv for pod openshift-network-diagnostics/network-check-target-qbhf5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:17:27.195739 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:27.195636 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/68f2b2b0-7441-4845-8db7-2d2bdb770218-kube-api-access-6ldvv podName:68f2b2b0-7441-4845-8db7-2d2bdb770218 nodeName:}" failed. No retries permitted until 2026-04-24 21:17:29.195616295 +0000 UTC m=+37.254403944 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-6ldvv" (UniqueName: "kubernetes.io/projected/68f2b2b0-7441-4845-8db7-2d2bdb770218-kube-api-access-6ldvv") pod "network-check-target-qbhf5" (UID: "68f2b2b0-7441-4845-8db7-2d2bdb770218") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:17:27.441395 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:27.441323 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ldwdd"
Apr 24 21:17:27.441554 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:27.441459 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ldwdd" podUID="23cacaf9-66cf-483e-89f1-70f1b4c3cc3c"
Apr 24 21:17:27.441871 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:27.441851 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qbhf5"
Apr 24 21:17:27.441974 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:27.441956 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qbhf5" podUID="68f2b2b0-7441-4845-8db7-2d2bdb770218"
Apr 24 21:17:27.493784 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:27.493756 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-nk67g"]
Apr 24 21:17:27.495802 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:27.495781 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-nk67g"
Apr 24 21:17:27.506943 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:27.505133 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 24 21:17:27.506943 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:27.505553 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 24 21:17:27.506943 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:27.505761 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 24 21:17:27.506943 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:27.506165 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-2svx2\""
Apr 24 21:17:27.506943 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:27.506369 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 24 21:17:27.508276 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:27.507856 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 24 21:17:27.508276 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:27.508018 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 24 21:17:27.601824 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:27.601430 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/076d7d40-dc29-4871-a077-5e08e9154463-sys\") pod \"node-exporter-nk67g\" (UID: \"076d7d40-dc29-4871-a077-5e08e9154463\") " pod="openshift-monitoring/node-exporter-nk67g"
Apr 24 21:17:27.601824 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:27.601472 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/076d7d40-dc29-4871-a077-5e08e9154463-node-exporter-wtmp\") pod \"node-exporter-nk67g\" (UID: \"076d7d40-dc29-4871-a077-5e08e9154463\") " pod="openshift-monitoring/node-exporter-nk67g"
Apr 24 21:17:27.601824 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:27.601507 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/076d7d40-dc29-4871-a077-5e08e9154463-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-nk67g\" (UID: \"076d7d40-dc29-4871-a077-5e08e9154463\") " pod="openshift-monitoring/node-exporter-nk67g"
Apr 24 21:17:27.601824 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:27.601552 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/076d7d40-dc29-4871-a077-5e08e9154463-node-exporter-accelerators-collector-config\") pod \"node-exporter-nk67g\" (UID: \"076d7d40-dc29-4871-a077-5e08e9154463\") " pod="openshift-monitoring/node-exporter-nk67g"
Apr 24 21:17:27.601824 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:27.601589 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/076d7d40-dc29-4871-a077-5e08e9154463-node-exporter-tls\") pod \"node-exporter-nk67g\" (UID: \"076d7d40-dc29-4871-a077-5e08e9154463\") " pod="openshift-monitoring/node-exporter-nk67g"
Apr 24 21:17:27.601824 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:27.601622 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/076d7d40-dc29-4871-a077-5e08e9154463-node-exporter-textfile\") pod \"node-exporter-nk67g\" (UID: \"076d7d40-dc29-4871-a077-5e08e9154463\") " pod="openshift-monitoring/node-exporter-nk67g"
Apr 24 21:17:27.601824 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:27.601649 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/076d7d40-dc29-4871-a077-5e08e9154463-metrics-client-ca\") pod \"node-exporter-nk67g\" (UID: \"076d7d40-dc29-4871-a077-5e08e9154463\") " pod="openshift-monitoring/node-exporter-nk67g"
Apr 24 21:17:27.601824 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:27.601688 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9rfz\" (UniqueName: \"kubernetes.io/projected/076d7d40-dc29-4871-a077-5e08e9154463-kube-api-access-s9rfz\") pod \"node-exporter-nk67g\" (UID: \"076d7d40-dc29-4871-a077-5e08e9154463\") " pod="openshift-monitoring/node-exporter-nk67g"
Apr 24 21:17:27.601824 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:27.601727 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/076d7d40-dc29-4871-a077-5e08e9154463-root\") pod \"node-exporter-nk67g\" (UID: \"076d7d40-dc29-4871-a077-5e08e9154463\") " pod="openshift-monitoring/node-exporter-nk67g"
Apr 24 21:17:27.704334 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:27.704245 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/076d7d40-dc29-4871-a077-5e08e9154463-root\") pod \"node-exporter-nk67g\" (UID: \"076d7d40-dc29-4871-a077-5e08e9154463\") " pod="openshift-monitoring/node-exporter-nk67g"
Apr 24 21:17:27.704334 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:27.704309 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/076d7d40-dc29-4871-a077-5e08e9154463-sys\") pod \"node-exporter-nk67g\" (UID: \"076d7d40-dc29-4871-a077-5e08e9154463\") " pod="openshift-monitoring/node-exporter-nk67g"
Apr 24 21:17:27.704538 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:27.704341 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/076d7d40-dc29-4871-a077-5e08e9154463-node-exporter-wtmp\") pod \"node-exporter-nk67g\" (UID: \"076d7d40-dc29-4871-a077-5e08e9154463\") " pod="openshift-monitoring/node-exporter-nk67g"
Apr 24 21:17:27.704538 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:27.704346 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/076d7d40-dc29-4871-a077-5e08e9154463-root\") pod \"node-exporter-nk67g\" (UID: \"076d7d40-dc29-4871-a077-5e08e9154463\") " pod="openshift-monitoring/node-exporter-nk67g"
Apr 24 21:17:27.704538 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:27.704376 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/076d7d40-dc29-4871-a077-5e08e9154463-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-nk67g\" (UID: \"076d7d40-dc29-4871-a077-5e08e9154463\") " pod="openshift-monitoring/node-exporter-nk67g"
Apr 24 21:17:27.704538 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:27.704414 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/076d7d40-dc29-4871-a077-5e08e9154463-sys\") pod \"node-exporter-nk67g\" (UID: \"076d7d40-dc29-4871-a077-5e08e9154463\") " pod="openshift-monitoring/node-exporter-nk67g"
Apr 24 21:17:27.704538 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:27.704441 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/076d7d40-dc29-4871-a077-5e08e9154463-node-exporter-accelerators-collector-config\") pod \"node-exporter-nk67g\" (UID: \"076d7d40-dc29-4871-a077-5e08e9154463\") " pod="openshift-monitoring/node-exporter-nk67g"
Apr 24 21:17:27.704538 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:27.704479 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/076d7d40-dc29-4871-a077-5e08e9154463-node-exporter-tls\") pod \"node-exporter-nk67g\" (UID: \"076d7d40-dc29-4871-a077-5e08e9154463\") " pod="openshift-monitoring/node-exporter-nk67g"
Apr 24 21:17:27.704538 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:27.704515 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/076d7d40-dc29-4871-a077-5e08e9154463-node-exporter-textfile\") pod \"node-exporter-nk67g\" (UID: \"076d7d40-dc29-4871-a077-5e08e9154463\") " pod="openshift-monitoring/node-exporter-nk67g"
Apr 24 21:17:27.704865 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:27.704548 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/076d7d40-dc29-4871-a077-5e08e9154463-node-exporter-wtmp\") pod \"node-exporter-nk67g\" (UID: \"076d7d40-dc29-4871-a077-5e08e9154463\") " pod="openshift-monitoring/node-exporter-nk67g"
Apr 24 21:17:27.704865 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:27.704585 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/076d7d40-dc29-4871-a077-5e08e9154463-metrics-client-ca\") pod \"node-exporter-nk67g\" (UID: \"076d7d40-dc29-4871-a077-5e08e9154463\") " pod="openshift-monitoring/node-exporter-nk67g"
Apr 24 21:17:27.704865 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:27.704628 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s9rfz\" (UniqueName: \"kubernetes.io/projected/076d7d40-dc29-4871-a077-5e08e9154463-kube-api-access-s9rfz\") pod \"node-exporter-nk67g\" (UID: \"076d7d40-dc29-4871-a077-5e08e9154463\") " pod="openshift-monitoring/node-exporter-nk67g"
Apr 24 21:17:27.706208 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:27.705121 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/076d7d40-dc29-4871-a077-5e08e9154463-node-exporter-accelerators-collector-config\") pod \"node-exporter-nk67g\" (UID: \"076d7d40-dc29-4871-a077-5e08e9154463\") " pod="openshift-monitoring/node-exporter-nk67g"
Apr 24 21:17:27.706208 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:27.705133 2560 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 24 21:17:27.706208 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:27.705192 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/076d7d40-dc29-4871-a077-5e08e9154463-node-exporter-tls podName:076d7d40-dc29-4871-a077-5e08e9154463 nodeName:}" failed. No retries permitted until 2026-04-24 21:17:28.205173905 +0000 UTC m=+36.263961556 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/076d7d40-dc29-4871-a077-5e08e9154463-node-exporter-tls") pod "node-exporter-nk67g" (UID: "076d7d40-dc29-4871-a077-5e08e9154463") : secret "node-exporter-tls" not found
Apr 24 21:17:27.706208 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:27.706029 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/076d7d40-dc29-4871-a077-5e08e9154463-metrics-client-ca\") pod \"node-exporter-nk67g\" (UID: \"076d7d40-dc29-4871-a077-5e08e9154463\") " pod="openshift-monitoring/node-exporter-nk67g"
Apr 24 21:17:27.706208 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:27.706168 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/076d7d40-dc29-4871-a077-5e08e9154463-node-exporter-textfile\") pod \"node-exporter-nk67g\" (UID: \"076d7d40-dc29-4871-a077-5e08e9154463\") " pod="openshift-monitoring/node-exporter-nk67g"
Apr 24 21:17:27.710030 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:27.709982 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/076d7d40-dc29-4871-a077-5e08e9154463-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-nk67g\" (UID: \"076d7d40-dc29-4871-a077-5e08e9154463\") " pod="openshift-monitoring/node-exporter-nk67g"
Apr 24 21:17:27.732966 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:27.732941 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9rfz\" (UniqueName: \"kubernetes.io/projected/076d7d40-dc29-4871-a077-5e08e9154463-kube-api-access-s9rfz\") pod \"node-exporter-nk67g\" (UID: \"076d7d40-dc29-4871-a077-5e08e9154463\") " pod="openshift-monitoring/node-exporter-nk67g"
Apr 24 21:17:27.982903 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:27.982830 2560 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 21:17:28.208309 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:28.208275 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/076d7d40-dc29-4871-a077-5e08e9154463-node-exporter-tls\") pod \"node-exporter-nk67g\" (UID: \"076d7d40-dc29-4871-a077-5e08e9154463\") " pod="openshift-monitoring/node-exporter-nk67g"
Apr 24 21:17:28.213234 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:28.212739 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/076d7d40-dc29-4871-a077-5e08e9154463-node-exporter-tls\") pod \"node-exporter-nk67g\" (UID: \"076d7d40-dc29-4871-a077-5e08e9154463\") " pod="openshift-monitoring/node-exporter-nk67g"
Apr 24 21:17:28.411431 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:28.411371 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-nk67g"
Apr 24 21:17:29.117379 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:29.117342 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/23cacaf9-66cf-483e-89f1-70f1b4c3cc3c-metrics-certs\") pod \"network-metrics-daemon-ldwdd\" (UID: \"23cacaf9-66cf-483e-89f1-70f1b4c3cc3c\") " pod="openshift-multus/network-metrics-daemon-ldwdd"
Apr 24 21:17:29.117839 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:29.117526 2560 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:17:29.117839 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:29.117599 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/23cacaf9-66cf-483e-89f1-70f1b4c3cc3c-metrics-certs podName:23cacaf9-66cf-483e-89f1-70f1b4c3cc3c nodeName:}" failed. No retries permitted until 2026-04-24 21:17:33.117578864 +0000 UTC m=+41.176366519 (durationBeforeRetry 4s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/23cacaf9-66cf-483e-89f1-70f1b4c3cc3c-metrics-certs") pod "network-metrics-daemon-ldwdd" (UID: "23cacaf9-66cf-483e-89f1-70f1b4c3cc3c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:17:29.217936 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:29.217889 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6ldvv\" (UniqueName: \"kubernetes.io/projected/68f2b2b0-7441-4845-8db7-2d2bdb770218-kube-api-access-6ldvv\") pod \"network-check-target-qbhf5\" (UID: \"68f2b2b0-7441-4845-8db7-2d2bdb770218\") " pod="openshift-network-diagnostics/network-check-target-qbhf5" Apr 24 21:17:29.218101 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:29.218085 2560 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 21:17:29.218160 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:29.218104 2560 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 21:17:29.218160 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:29.218118 2560 projected.go:194] Error preparing data for projected volume kube-api-access-6ldvv for pod openshift-network-diagnostics/network-check-target-qbhf5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:17:29.218257 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:29.218169 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/68f2b2b0-7441-4845-8db7-2d2bdb770218-kube-api-access-6ldvv podName:68f2b2b0-7441-4845-8db7-2d2bdb770218 nodeName:}" failed. 
No retries permitted until 2026-04-24 21:17:33.218151519 +0000 UTC m=+41.276939175 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-6ldvv" (UniqueName: "kubernetes.io/projected/68f2b2b0-7441-4845-8db7-2d2bdb770218-kube-api-access-6ldvv") pod "network-check-target-qbhf5" (UID: "68f2b2b0-7441-4845-8db7-2d2bdb770218") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:17:29.441748 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:29.441099 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qbhf5" Apr 24 21:17:29.441748 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:29.441214 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qbhf5" podUID="68f2b2b0-7441-4845-8db7-2d2bdb770218" Apr 24 21:17:29.441748 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:29.441562 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ldwdd" Apr 24 21:17:29.441748 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:29.441662 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ldwdd" podUID="23cacaf9-66cf-483e-89f1-70f1b4c3cc3c" Apr 24 21:17:30.708801 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:30.708440 2560 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 21:17:31.441490 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:31.441032 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ldwdd" Apr 24 21:17:31.441490 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:31.441064 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qbhf5" Apr 24 21:17:31.441490 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:31.441162 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ldwdd" podUID="23cacaf9-66cf-483e-89f1-70f1b4c3cc3c" Apr 24 21:17:31.441490 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:31.441324 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-qbhf5" podUID="68f2b2b0-7441-4845-8db7-2d2bdb770218" Apr 24 21:17:33.150343 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:33.150306 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/23cacaf9-66cf-483e-89f1-70f1b4c3cc3c-metrics-certs\") pod \"network-metrics-daemon-ldwdd\" (UID: \"23cacaf9-66cf-483e-89f1-70f1b4c3cc3c\") " pod="openshift-multus/network-metrics-daemon-ldwdd" Apr 24 21:17:33.150814 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:33.150429 2560 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:17:33.150814 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:33.150481 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/23cacaf9-66cf-483e-89f1-70f1b4c3cc3c-metrics-certs podName:23cacaf9-66cf-483e-89f1-70f1b4c3cc3c nodeName:}" failed. No retries permitted until 2026-04-24 21:17:41.150464454 +0000 UTC m=+49.209252105 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/23cacaf9-66cf-483e-89f1-70f1b4c3cc3c-metrics-certs") pod "network-metrics-daemon-ldwdd" (UID: "23cacaf9-66cf-483e-89f1-70f1b4c3cc3c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:17:33.251590 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:33.251040 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6ldvv\" (UniqueName: \"kubernetes.io/projected/68f2b2b0-7441-4845-8db7-2d2bdb770218-kube-api-access-6ldvv\") pod \"network-check-target-qbhf5\" (UID: \"68f2b2b0-7441-4845-8db7-2d2bdb770218\") " pod="openshift-network-diagnostics/network-check-target-qbhf5" Apr 24 21:17:33.251590 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:33.251192 2560 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 21:17:33.251590 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:33.251209 2560 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 21:17:33.251590 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:33.251220 2560 projected.go:194] Error preparing data for projected volume kube-api-access-6ldvv for pod openshift-network-diagnostics/network-check-target-qbhf5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:17:33.251590 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:33.251277 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/68f2b2b0-7441-4845-8db7-2d2bdb770218-kube-api-access-6ldvv podName:68f2b2b0-7441-4845-8db7-2d2bdb770218 nodeName:}" failed. 
No retries permitted until 2026-04-24 21:17:41.251258198 +0000 UTC m=+49.310045850 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-6ldvv" (UniqueName: "kubernetes.io/projected/68f2b2b0-7441-4845-8db7-2d2bdb770218-kube-api-access-6ldvv") pod "network-check-target-qbhf5" (UID: "68f2b2b0-7441-4845-8db7-2d2bdb770218") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:17:33.440838 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:33.440662 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ldwdd" Apr 24 21:17:33.440838 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:33.440682 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qbhf5" Apr 24 21:17:33.440838 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:33.440787 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ldwdd" podUID="23cacaf9-66cf-483e-89f1-70f1b4c3cc3c" Apr 24 21:17:33.441166 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:33.441137 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-qbhf5" podUID="68f2b2b0-7441-4845-8db7-2d2bdb770218" Apr 24 21:17:35.440839 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:35.440809 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qbhf5" Apr 24 21:17:35.441282 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:35.440809 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ldwdd" Apr 24 21:17:35.441282 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:35.440910 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qbhf5" podUID="68f2b2b0-7441-4845-8db7-2d2bdb770218" Apr 24 21:17:35.441282 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:35.440997 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ldwdd" podUID="23cacaf9-66cf-483e-89f1-70f1b4c3cc3c" Apr 24 21:17:37.441487 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:37.441461 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ldwdd" Apr 24 21:17:37.441862 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:37.441463 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qbhf5" Apr 24 21:17:37.441862 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:37.441562 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ldwdd" podUID="23cacaf9-66cf-483e-89f1-70f1b4c3cc3c" Apr 24 21:17:37.441862 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:37.441622 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qbhf5" podUID="68f2b2b0-7441-4845-8db7-2d2bdb770218" Apr 24 21:17:37.441991 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:37.441914 2560 scope.go:117] "RemoveContainer" containerID="d883a9ca304a7a3c7115f428d33ac1de550a2326595ec262d9a96659becf12ff" Apr 24 21:17:38.035254 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:38.035220 2560 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 21:17:39.440840 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:39.440815 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ldwdd" Apr 24 21:17:39.441248 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:39.440821 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qbhf5" Apr 24 21:17:39.441248 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:39.440942 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ldwdd" podUID="23cacaf9-66cf-483e-89f1-70f1b4c3cc3c" Apr 24 21:17:39.441248 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:39.441023 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qbhf5" podUID="68f2b2b0-7441-4845-8db7-2d2bdb770218" Apr 24 21:17:41.203358 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:41.203322 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/23cacaf9-66cf-483e-89f1-70f1b4c3cc3c-metrics-certs\") pod \"network-metrics-daemon-ldwdd\" (UID: \"23cacaf9-66cf-483e-89f1-70f1b4c3cc3c\") " pod="openshift-multus/network-metrics-daemon-ldwdd" Apr 24 21:17:41.203954 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:41.203494 2560 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:17:41.203954 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:41.203578 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/23cacaf9-66cf-483e-89f1-70f1b4c3cc3c-metrics-certs podName:23cacaf9-66cf-483e-89f1-70f1b4c3cc3c nodeName:}" failed. 
No retries permitted until 2026-04-24 21:17:57.203555553 +0000 UTC m=+65.262343216 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/23cacaf9-66cf-483e-89f1-70f1b4c3cc3c-metrics-certs") pod "network-metrics-daemon-ldwdd" (UID: "23cacaf9-66cf-483e-89f1-70f1b4c3cc3c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:17:41.304479 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:41.304439 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6ldvv\" (UniqueName: \"kubernetes.io/projected/68f2b2b0-7441-4845-8db7-2d2bdb770218-kube-api-access-6ldvv\") pod \"network-check-target-qbhf5\" (UID: \"68f2b2b0-7441-4845-8db7-2d2bdb770218\") " pod="openshift-network-diagnostics/network-check-target-qbhf5" Apr 24 21:17:41.304632 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:41.304594 2560 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 21:17:41.304632 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:41.304615 2560 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 21:17:41.304632 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:41.304627 2560 projected.go:194] Error preparing data for projected volume kube-api-access-6ldvv for pod openshift-network-diagnostics/network-check-target-qbhf5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:17:41.304779 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:41.304688 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/68f2b2b0-7441-4845-8db7-2d2bdb770218-kube-api-access-6ldvv 
podName:68f2b2b0-7441-4845-8db7-2d2bdb770218 nodeName:}" failed. No retries permitted until 2026-04-24 21:17:57.304669282 +0000 UTC m=+65.363456936 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-6ldvv" (UniqueName: "kubernetes.io/projected/68f2b2b0-7441-4845-8db7-2d2bdb770218-kube-api-access-6ldvv") pod "network-check-target-qbhf5" (UID: "68f2b2b0-7441-4845-8db7-2d2bdb770218") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:17:41.440845 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:41.440813 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ldwdd" Apr 24 21:17:41.440845 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:41.440835 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qbhf5" Apr 24 21:17:41.441095 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:41.440944 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ldwdd" podUID="23cacaf9-66cf-483e-89f1-70f1b4c3cc3c" Apr 24 21:17:41.441095 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:41.441060 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-qbhf5" podUID="68f2b2b0-7441-4845-8db7-2d2bdb770218" Apr 24 21:17:42.561722 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:42.561476 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-df2mh" event={"ID":"d0fb6fb7-0535-4776-8fae-ef83f9bfdcca","Type":"ContainerStarted","Data":"013e5e222e4545ef6f5b8e7a37968ea6664600bd885fa2a41bc1ffe6ae39d2d2"} Apr 24 21:17:42.563057 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:42.563033 2560 generic.go:358] "Generic (PLEG): container finished" podID="849dc6e4-06dd-4834-a5eb-de6ceebd649f" containerID="3e52c3cfca56d89e0a71c15db4d16b8ce2b7be1ddd348d7ca26456377a7ed7ed" exitCode=0 Apr 24 21:17:42.563143 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:42.563098 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kqxwj" event={"ID":"849dc6e4-06dd-4834-a5eb-de6ceebd649f","Type":"ContainerDied","Data":"3e52c3cfca56d89e0a71c15db4d16b8ce2b7be1ddd348d7ca26456377a7ed7ed"} Apr 24 21:17:42.565425 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:42.564701 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-r5z5s" event={"ID":"ce514c8a-dcef-4be4-9c59-0a9305bef822","Type":"ContainerStarted","Data":"19ac89c033e792d007f20e78d03c3b5969c7c53cacb115a8aa26ba6a6ef9ac7b"} Apr 24 21:17:42.566458 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:42.566432 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2t4hs" event={"ID":"3ef140eb-72c8-462e-9469-9aa900c0be05","Type":"ContainerStarted","Data":"ba90cc536cd9834e0a2bd81a55fdbbdcb91bb8438ec0586a674d7eaf92e29824"} Apr 24 21:17:42.569895 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:42.569872 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6rbsp_260be1ca-939b-4b9e-9f93-078d2506aef0/ovn-acl-logging/0.log" Apr 24 
21:17:42.570377 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:42.570216 2560 generic.go:358] "Generic (PLEG): container finished" podID="260be1ca-939b-4b9e-9f93-078d2506aef0" containerID="959eaae7b8025ed6322d9839436f60ecbc156b453bea0583579bd46620d2122d" exitCode=1 Apr 24 21:17:42.570377 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:42.570276 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6rbsp" event={"ID":"260be1ca-939b-4b9e-9f93-078d2506aef0","Type":"ContainerStarted","Data":"1906b74d4cf90d1f1fe9a05b688e96e90759b0a09bf242a3d59af96bc40c3102"} Apr 24 21:17:42.570377 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:42.570297 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6rbsp" event={"ID":"260be1ca-939b-4b9e-9f93-078d2506aef0","Type":"ContainerStarted","Data":"5fd300ddac8813d607a5652e7514fa3d94cf676100a19c503000c8992c7c143f"} Apr 24 21:17:42.570377 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:42.570310 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6rbsp" event={"ID":"260be1ca-939b-4b9e-9f93-078d2506aef0","Type":"ContainerStarted","Data":"4a12d02cd489c3ecbf0b33b2af60f478ba97e79844040147a690353201ba3e3e"} Apr 24 21:17:42.570377 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:42.570322 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6rbsp" event={"ID":"260be1ca-939b-4b9e-9f93-078d2506aef0","Type":"ContainerDied","Data":"959eaae7b8025ed6322d9839436f60ecbc156b453bea0583579bd46620d2122d"} Apr 24 21:17:42.570377 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:42.570339 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6rbsp" event={"ID":"260be1ca-939b-4b9e-9f93-078d2506aef0","Type":"ContainerStarted","Data":"64582fe7ae0f6eb473de8732133ad77bfdee378d5e0ca5ea5b4a52d3ba15f293"} Apr 24 21:17:42.571603 ip-10-0-128-142 
kubenswrapper[2560]: I0424 21:17:42.571555 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-t6vpk" event={"ID":"05f6587b-f5da-428a-8968-80f271212138","Type":"ContainerStarted","Data":"4ee870a27d9a472d744e7b1d2ea4084abccaa779c16666708608a72601255d1a"} Apr 24 21:17:42.573184 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:42.573156 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-x6t58" event={"ID":"b4deb568-1692-4033-a2a6-45866b8c89db","Type":"ContainerStarted","Data":"97e16b353303ce5a78ad95b9b5f8991aecf09f749c772d71fcc67c97d3c1ad81"} Apr 24 21:17:42.575477 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:42.575419 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-97kf5" event={"ID":"a5ac4a85-e0fb-4193-bc28-23442097690b","Type":"ContainerStarted","Data":"c3d81d7a9955a3ffbb73683ea18291e947c48108b3f3d105ddff4a041c4b4743"} Apr 24 21:17:42.576700 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:42.576679 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-nk67g" event={"ID":"076d7d40-dc29-4871-a077-5e08e9154463","Type":"ContainerStarted","Data":"8818c233fa479cca4534f7447838d8c289c16c2a2177ffeb95e39ef64f84bf1c"} Apr 24 21:17:42.579793 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:42.579541 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-128-142.ec2.internal_784ad9f5e87f1e75291c25fc06106e5e/kube-rbac-proxy-crio/2.log" Apr 24 21:17:42.580622 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:42.580497 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-142.ec2.internal" event={"ID":"784ad9f5e87f1e75291c25fc06106e5e","Type":"ContainerStarted","Data":"9dba62479e698bed275e1c1e3f9228f782af2a2e3d565f1fbc39435825023f83"} Apr 24 21:17:42.630617 
ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:42.630577 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-df2mh" podStartSLOduration=7.247153993 podStartE2EDuration="23.630564416s" podCreationTimestamp="2026-04-24 21:17:19 +0000 UTC" firstStartedPulling="2026-04-24 21:17:25.666865698 +0000 UTC m=+33.725653351" lastFinishedPulling="2026-04-24 21:17:42.050276123 +0000 UTC m=+50.109063774" observedRunningTime="2026-04-24 21:17:42.616130286 +0000 UTC m=+50.674917957" watchObservedRunningTime="2026-04-24 21:17:42.630564416 +0000 UTC m=+50.689352087" Apr 24 21:17:42.744391 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:42.744350 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-x6t58" podStartSLOduration=7.729741597 podStartE2EDuration="23.744336974s" podCreationTimestamp="2026-04-24 21:17:19 +0000 UTC" firstStartedPulling="2026-04-24 21:17:25.656095334 +0000 UTC m=+33.714882984" lastFinishedPulling="2026-04-24 21:17:41.670690703 +0000 UTC m=+49.729478361" observedRunningTime="2026-04-24 21:17:42.743933949 +0000 UTC m=+50.802721613" watchObservedRunningTime="2026-04-24 21:17:42.744336974 +0000 UTC m=+50.803124644" Apr 24 21:17:42.777332 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:42.777284 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-t6vpk" podStartSLOduration=7.783074289 podStartE2EDuration="23.777270721s" podCreationTimestamp="2026-04-24 21:17:19 +0000 UTC" firstStartedPulling="2026-04-24 21:17:25.676882371 +0000 UTC m=+33.735670021" lastFinishedPulling="2026-04-24 21:17:41.671078804 +0000 UTC m=+49.729866453" observedRunningTime="2026-04-24 21:17:42.777132235 +0000 UTC m=+50.835919905" watchObservedRunningTime="2026-04-24 21:17:42.777270721 +0000 UTC m=+50.836058391" Apr 24 21:17:42.804421 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:42.804383 2560 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-142.ec2.internal" podStartSLOduration=17.804367471 podStartE2EDuration="17.804367471s" podCreationTimestamp="2026-04-24 21:17:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:17:42.803802308 +0000 UTC m=+50.862589978" watchObservedRunningTime="2026-04-24 21:17:42.804367471 +0000 UTC m=+50.863155141" Apr 24 21:17:42.830263 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:42.830221 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-97kf5" podStartSLOduration=7.471027217 podStartE2EDuration="23.830205886s" podCreationTimestamp="2026-04-24 21:17:19 +0000 UTC" firstStartedPulling="2026-04-24 21:17:25.632787605 +0000 UTC m=+33.691575258" lastFinishedPulling="2026-04-24 21:17:41.991966266 +0000 UTC m=+50.050753927" observedRunningTime="2026-04-24 21:17:42.8297822 +0000 UTC m=+50.888569871" watchObservedRunningTime="2026-04-24 21:17:42.830205886 +0000 UTC m=+50.888993556" Apr 24 21:17:42.868798 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:42.868748 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-r5z5s" podStartSLOduration=12.057069749 podStartE2EDuration="23.868733743s" podCreationTimestamp="2026-04-24 21:17:19 +0000 UTC" firstStartedPulling="2026-04-24 21:17:25.651547383 +0000 UTC m=+33.710335032" lastFinishedPulling="2026-04-24 21:17:37.463211359 +0000 UTC m=+45.521999026" observedRunningTime="2026-04-24 21:17:42.868423656 +0000 UTC m=+50.927211329" watchObservedRunningTime="2026-04-24 21:17:42.868733743 +0000 UTC m=+50.927521413" Apr 24 21:17:43.250482 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:43.250457 2560 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" 
path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 24 21:17:43.393058 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:43.392883 2560 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-24T21:17:43.250478896Z","UUID":"e5795a48-6d34-46ad-9bcf-838f454b39c8","Handler":null,"Name":"","Endpoint":""} Apr 24 21:17:43.394905 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:43.394881 2560 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 24 21:17:43.395052 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:43.394912 2560 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 24 21:17:43.441107 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:43.441082 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qbhf5" Apr 24 21:17:43.441248 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:43.441225 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qbhf5" podUID="68f2b2b0-7441-4845-8db7-2d2bdb770218" Apr 24 21:17:43.441292 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:43.441276 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ldwdd" Apr 24 21:17:43.441431 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:43.441405 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ldwdd" podUID="23cacaf9-66cf-483e-89f1-70f1b4c3cc3c" Apr 24 21:17:43.583647 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:43.583610 2560 generic.go:358] "Generic (PLEG): container finished" podID="076d7d40-dc29-4871-a077-5e08e9154463" containerID="11d97487957271df8106b0600b365f66b65f1b6c0011d1859c0ee1671029953f" exitCode=0 Apr 24 21:17:43.584520 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:43.583699 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-nk67g" event={"ID":"076d7d40-dc29-4871-a077-5e08e9154463","Type":"ContainerDied","Data":"11d97487957271df8106b0600b365f66b65f1b6c0011d1859c0ee1671029953f"} Apr 24 21:17:43.585265 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:43.585229 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-q6gt8" event={"ID":"6159f82c-9148-4579-8103-7b0956bd6ce8","Type":"ContainerStarted","Data":"3298c5424d1af82d12460d6df8c3714b162578088574de1ecd635b77501eea1b"} Apr 24 21:17:43.587029 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:43.587012 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2t4hs" event={"ID":"3ef140eb-72c8-462e-9469-9aa900c0be05","Type":"ContainerStarted","Data":"194a75de36189008dc90e0731f5e89b92fbe490e74dbb949ee3f28823015275c"} Apr 24 21:17:43.589704 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:43.589682 2560 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6rbsp_260be1ca-939b-4b9e-9f93-078d2506aef0/ovn-acl-logging/0.log" Apr 24 21:17:43.590067 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:43.590043 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6rbsp" event={"ID":"260be1ca-939b-4b9e-9f93-078d2506aef0","Type":"ContainerStarted","Data":"f1444127de38d67342b9c392cdfcfc827bc083d8976f9bb4eb5288e980f8f320"} Apr 24 21:17:43.666230 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:43.666177 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-q6gt8" podStartSLOduration=8.346888898 podStartE2EDuration="24.666157135s" podCreationTimestamp="2026-04-24 21:17:19 +0000 UTC" firstStartedPulling="2026-04-24 21:17:25.672435256 +0000 UTC m=+33.731222909" lastFinishedPulling="2026-04-24 21:17:41.991703495 +0000 UTC m=+50.050491146" observedRunningTime="2026-04-24 21:17:43.665193914 +0000 UTC m=+51.723981586" watchObservedRunningTime="2026-04-24 21:17:43.666157135 +0000 UTC m=+51.724944797" Apr 24 21:17:44.594675 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:44.594421 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2t4hs" event={"ID":"3ef140eb-72c8-462e-9469-9aa900c0be05","Type":"ContainerStarted","Data":"5b88747a241d069db07bee94e9a2c77a252e9e5085deee069a8fe6406436134d"} Apr 24 21:17:44.596802 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:44.596730 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-nk67g" event={"ID":"076d7d40-dc29-4871-a077-5e08e9154463","Type":"ContainerStarted","Data":"50f0b9f5ee367ae87083a3e659454b295a74ee132a27ae9c60b3b9f6bbf12375"} Apr 24 21:17:44.596802 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:44.596765 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-nk67g" 
event={"ID":"076d7d40-dc29-4871-a077-5e08e9154463","Type":"ContainerStarted","Data":"2b478cc721fdeb6ec2870eccd60e02e8d3f2f10f3f916e79a323eb1b0b5efa2c"} Apr 24 21:17:44.623104 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:44.623051 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2t4hs" podStartSLOduration=7.133861626 podStartE2EDuration="25.623034234s" podCreationTimestamp="2026-04-24 21:17:19 +0000 UTC" firstStartedPulling="2026-04-24 21:17:25.646031036 +0000 UTC m=+33.704818685" lastFinishedPulling="2026-04-24 21:17:44.135203643 +0000 UTC m=+52.193991293" observedRunningTime="2026-04-24 21:17:44.621718094 +0000 UTC m=+52.680505766" watchObservedRunningTime="2026-04-24 21:17:44.623034234 +0000 UTC m=+52.681821907" Apr 24 21:17:44.647758 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:44.647716 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-nk67g" podStartSLOduration=16.935878425 podStartE2EDuration="17.647701109s" podCreationTimestamp="2026-04-24 21:17:27 +0000 UTC" firstStartedPulling="2026-04-24 21:17:42.009919129 +0000 UTC m=+50.068706778" lastFinishedPulling="2026-04-24 21:17:42.721741809 +0000 UTC m=+50.780529462" observedRunningTime="2026-04-24 21:17:44.646590128 +0000 UTC m=+52.705377831" watchObservedRunningTime="2026-04-24 21:17:44.647701109 +0000 UTC m=+52.706488789" Apr 24 21:17:45.441405 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:45.441371 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ldwdd" Apr 24 21:17:45.441580 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:45.441371 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qbhf5" Apr 24 21:17:45.441580 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:45.441505 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ldwdd" podUID="23cacaf9-66cf-483e-89f1-70f1b4c3cc3c" Apr 24 21:17:45.441580 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:45.441571 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qbhf5" podUID="68f2b2b0-7441-4845-8db7-2d2bdb770218" Apr 24 21:17:45.601880 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:45.601852 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6rbsp_260be1ca-939b-4b9e-9f93-078d2506aef0/ovn-acl-logging/0.log" Apr 24 21:17:45.602291 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:45.602190 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6rbsp" event={"ID":"260be1ca-939b-4b9e-9f93-078d2506aef0","Type":"ContainerStarted","Data":"a1e1e49ef9da62755e4175404ff0758d130aa549ca543ba8d9e6e731b5bfdcff"} Apr 24 21:17:45.669337 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:45.669311 2560 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-t6vpk" Apr 24 21:17:45.669871 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:45.669846 2560 kubelet.go:2658] "SyncLoop (probe)" probe="startup" 
status="started" pod="kube-system/konnectivity-agent-t6vpk" Apr 24 21:17:46.604284 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:46.604252 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-t6vpk" Apr 24 21:17:46.604645 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:46.604631 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-t6vpk" Apr 24 21:17:47.440762 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:47.440603 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qbhf5" Apr 24 21:17:47.440896 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:47.440603 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ldwdd" Apr 24 21:17:47.440896 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:47.440826 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qbhf5" podUID="68f2b2b0-7441-4845-8db7-2d2bdb770218" Apr 24 21:17:47.440985 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:47.440909 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ldwdd" podUID="23cacaf9-66cf-483e-89f1-70f1b4c3cc3c" Apr 24 21:17:47.607076 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:47.607043 2560 generic.go:358] "Generic (PLEG): container finished" podID="849dc6e4-06dd-4834-a5eb-de6ceebd649f" containerID="4762f50596b8cb6bf27821ad35182ace4514ee0e0567a50302a51a147557ac78" exitCode=0 Apr 24 21:17:47.607503 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:47.607114 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kqxwj" event={"ID":"849dc6e4-06dd-4834-a5eb-de6ceebd649f","Type":"ContainerDied","Data":"4762f50596b8cb6bf27821ad35182ace4514ee0e0567a50302a51a147557ac78"} Apr 24 21:17:47.610100 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:47.610085 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6rbsp_260be1ca-939b-4b9e-9f93-078d2506aef0/ovn-acl-logging/0.log" Apr 24 21:17:47.610478 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:47.610444 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6rbsp" event={"ID":"260be1ca-939b-4b9e-9f93-078d2506aef0","Type":"ContainerStarted","Data":"515df1d9da2d3788a61470d7207f3924f9b3cd6aecae265ddbadfbd547ec3e27"} Apr 24 21:17:47.610836 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:47.610821 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-6rbsp" Apr 24 21:17:47.610898 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:47.610845 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-6rbsp" Apr 24 21:17:47.610977 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:47.610962 2560 scope.go:117] "RemoveContainer" containerID="959eaae7b8025ed6322d9839436f60ecbc156b453bea0583579bd46620d2122d" Apr 24 21:17:47.625506 ip-10-0-128-142 kubenswrapper[2560]: I0424 
21:17:47.625490 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-6rbsp" Apr 24 21:17:48.617279 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:48.617255 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6rbsp_260be1ca-939b-4b9e-9f93-078d2506aef0/ovn-acl-logging/0.log" Apr 24 21:17:48.617973 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:48.617600 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6rbsp" event={"ID":"260be1ca-939b-4b9e-9f93-078d2506aef0","Type":"ContainerStarted","Data":"793a9d9bc33285f60944fe17737ba9df64a9e6c5e9140a23c0b7e128425d84eb"} Apr 24 21:17:48.617973 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:48.617777 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-6rbsp" Apr 24 21:17:48.630427 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:48.630406 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-6rbsp" Apr 24 21:17:48.665093 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:48.665045 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-6rbsp" podStartSLOduration=13.206907131 podStartE2EDuration="29.665034111s" podCreationTimestamp="2026-04-24 21:17:19 +0000 UTC" firstStartedPulling="2026-04-24 21:17:25.640972143 +0000 UTC m=+33.699759792" lastFinishedPulling="2026-04-24 21:17:42.099099107 +0000 UTC m=+50.157886772" observedRunningTime="2026-04-24 21:17:48.663182417 +0000 UTC m=+56.721970098" watchObservedRunningTime="2026-04-24 21:17:48.665034111 +0000 UTC m=+56.723821782" Apr 24 21:17:49.045021 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:49.044995 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-ldwdd"] Apr 24 
21:17:49.045150 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:49.045108 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ldwdd" Apr 24 21:17:49.045203 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:49.045187 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ldwdd" podUID="23cacaf9-66cf-483e-89f1-70f1b4c3cc3c" Apr 24 21:17:49.058355 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:49.058335 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-qbhf5"] Apr 24 21:17:49.058444 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:49.058417 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qbhf5" Apr 24 21:17:49.058494 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:49.058478 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-qbhf5" podUID="68f2b2b0-7441-4845-8db7-2d2bdb770218" Apr 24 21:17:49.621086 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:49.621053 2560 generic.go:358] "Generic (PLEG): container finished" podID="849dc6e4-06dd-4834-a5eb-de6ceebd649f" containerID="a02536cd714068d8deb2778a7e2e74eb6bfcf7901b662b079598aa4307521c47" exitCode=0 Apr 24 21:17:49.621542 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:49.621141 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kqxwj" event={"ID":"849dc6e4-06dd-4834-a5eb-de6ceebd649f","Type":"ContainerDied","Data":"a02536cd714068d8deb2778a7e2e74eb6bfcf7901b662b079598aa4307521c47"} Apr 24 21:17:50.441546 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:50.441518 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qbhf5" Apr 24 21:17:50.441676 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:50.441563 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ldwdd" Apr 24 21:17:50.441676 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:50.441645 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ldwdd" podUID="23cacaf9-66cf-483e-89f1-70f1b4c3cc3c" Apr 24 21:17:50.441778 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:50.441755 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qbhf5" podUID="68f2b2b0-7441-4845-8db7-2d2bdb770218" Apr 24 21:17:51.627008 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:51.626808 2560 generic.go:358] "Generic (PLEG): container finished" podID="849dc6e4-06dd-4834-a5eb-de6ceebd649f" containerID="c720f29fa66650fef327923e0fd1b9a152934cab2ae6f8d67e21160b8674ccba" exitCode=0 Apr 24 21:17:51.627403 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:51.626871 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kqxwj" event={"ID":"849dc6e4-06dd-4834-a5eb-de6ceebd649f","Type":"ContainerDied","Data":"c720f29fa66650fef327923e0fd1b9a152934cab2ae6f8d67e21160b8674ccba"} Apr 24 21:17:52.441914 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:52.441877 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qbhf5" Apr 24 21:17:52.442165 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:52.441974 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ldwdd" Apr 24 21:17:52.442165 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:52.442061 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ldwdd" podUID="23cacaf9-66cf-483e-89f1-70f1b4c3cc3c" Apr 24 21:17:52.442359 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:52.442324 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qbhf5" podUID="68f2b2b0-7441-4845-8db7-2d2bdb770218" Apr 24 21:17:54.441618 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:54.441590 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qbhf5" Apr 24 21:17:54.442032 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:54.441696 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qbhf5" podUID="68f2b2b0-7441-4845-8db7-2d2bdb770218" Apr 24 21:17:54.442032 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:54.441732 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ldwdd" Apr 24 21:17:54.442032 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:17:54.441820 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ldwdd" podUID="23cacaf9-66cf-483e-89f1-70f1b4c3cc3c" Apr 24 21:17:54.751679 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:54.751651 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-142.ec2.internal" event="NodeReady" Apr 24 21:17:54.751808 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:54.751791 2560 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 24 21:17:54.811917 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:54.811879 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-gdwvv"] Apr 24 21:17:54.850855 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:54.850828 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-5dm7q"] Apr 24 21:17:54.850994 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:54.850879 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-gdwvv" Apr 24 21:17:54.853530 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:54.853499 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 24 21:17:54.853639 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:54.853551 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 24 21:17:54.853690 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:54.853650 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-rt6g8\"" Apr 24 21:17:54.853954 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:54.853938 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 24 21:17:54.871771 ip-10-0-128-142 kubenswrapper[2560]: I0424 
21:17:54.871745 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-gdwvv"] Apr 24 21:17:54.871771 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:54.871769 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-5dm7q"] Apr 24 21:17:54.871918 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:54.871872 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-5dm7q" Apr 24 21:17:54.874230 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:54.874208 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 24 21:17:54.874444 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:54.874408 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 24 21:17:54.874698 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:54.874647 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 24 21:17:54.875727 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:54.874902 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 24 21:17:54.876159 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:54.876139 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-bqlv4\"" Apr 24 21:17:54.903893 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:54.903874 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-cx2qc"] Apr 24 21:17:54.915724 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:54.915705 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/cc47cff2-ad26-4f59-b1d8-6831f76de599-cert\") pod \"ingress-canary-gdwvv\" (UID: \"cc47cff2-ad26-4f59-b1d8-6831f76de599\") " pod="openshift-ingress-canary/ingress-canary-gdwvv" Apr 24 21:17:54.915828 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:54.915742 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwpp7\" (UniqueName: \"kubernetes.io/projected/cc47cff2-ad26-4f59-b1d8-6831f76de599-kube-api-access-gwpp7\") pod \"ingress-canary-gdwvv\" (UID: \"cc47cff2-ad26-4f59-b1d8-6831f76de599\") " pod="openshift-ingress-canary/ingress-canary-gdwvv" Apr 24 21:17:54.921330 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:54.921312 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-cx2qc"] Apr 24 21:17:54.921451 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:54.921434 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-cx2qc" Apr 24 21:17:54.923648 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:54.923633 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-vh4z7\"" Apr 24 21:17:54.923734 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:54.923714 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 24 21:17:54.923805 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:54.923791 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 24 21:17:55.017086 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:55.017055 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gwpp7\" (UniqueName: \"kubernetes.io/projected/cc47cff2-ad26-4f59-b1d8-6831f76de599-kube-api-access-gwpp7\") pod \"ingress-canary-gdwvv\" (UID: \"cc47cff2-ad26-4f59-b1d8-6831f76de599\") " 
pod="openshift-ingress-canary/ingress-canary-gdwvv" Apr 24 21:17:55.017240 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:55.017094 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/6d07634f-dd2f-4257-8023-4cc8ec27fae2-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-5dm7q\" (UID: \"6d07634f-dd2f-4257-8023-4cc8ec27fae2\") " pod="openshift-insights/insights-runtime-extractor-5dm7q" Apr 24 21:17:55.017240 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:55.017120 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spr5m\" (UniqueName: \"kubernetes.io/projected/a282a503-61d3-4d88-bb58-1c4989fe6bd8-kube-api-access-spr5m\") pod \"dns-default-cx2qc\" (UID: \"a282a503-61d3-4d88-bb58-1c4989fe6bd8\") " pod="openshift-dns/dns-default-cx2qc" Apr 24 21:17:55.017240 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:55.017153 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8s6rj\" (UniqueName: \"kubernetes.io/projected/6d07634f-dd2f-4257-8023-4cc8ec27fae2-kube-api-access-8s6rj\") pod \"insights-runtime-extractor-5dm7q\" (UID: \"6d07634f-dd2f-4257-8023-4cc8ec27fae2\") " pod="openshift-insights/insights-runtime-extractor-5dm7q" Apr 24 21:17:55.017240 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:55.017215 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a282a503-61d3-4d88-bb58-1c4989fe6bd8-tmp-dir\") pod \"dns-default-cx2qc\" (UID: \"a282a503-61d3-4d88-bb58-1c4989fe6bd8\") " pod="openshift-dns/dns-default-cx2qc" Apr 24 21:17:55.017444 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:55.017257 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/a282a503-61d3-4d88-bb58-1c4989fe6bd8-metrics-tls\") pod \"dns-default-cx2qc\" (UID: \"a282a503-61d3-4d88-bb58-1c4989fe6bd8\") " pod="openshift-dns/dns-default-cx2qc"
Apr 24 21:17:55.017444 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:55.017298 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/6d07634f-dd2f-4257-8023-4cc8ec27fae2-crio-socket\") pod \"insights-runtime-extractor-5dm7q\" (UID: \"6d07634f-dd2f-4257-8023-4cc8ec27fae2\") " pod="openshift-insights/insights-runtime-extractor-5dm7q"
Apr 24 21:17:55.017444 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:55.017339 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/6d07634f-dd2f-4257-8023-4cc8ec27fae2-data-volume\") pod \"insights-runtime-extractor-5dm7q\" (UID: \"6d07634f-dd2f-4257-8023-4cc8ec27fae2\") " pod="openshift-insights/insights-runtime-extractor-5dm7q"
Apr 24 21:17:55.017444 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:55.017395 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cc47cff2-ad26-4f59-b1d8-6831f76de599-cert\") pod \"ingress-canary-gdwvv\" (UID: \"cc47cff2-ad26-4f59-b1d8-6831f76de599\") " pod="openshift-ingress-canary/ingress-canary-gdwvv"
Apr 24 21:17:55.017444 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:55.017430 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/6d07634f-dd2f-4257-8023-4cc8ec27fae2-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-5dm7q\" (UID: \"6d07634f-dd2f-4257-8023-4cc8ec27fae2\") " pod="openshift-insights/insights-runtime-extractor-5dm7q"
Apr 24 21:17:55.017659 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:55.017465 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a282a503-61d3-4d88-bb58-1c4989fe6bd8-config-volume\") pod \"dns-default-cx2qc\" (UID: \"a282a503-61d3-4d88-bb58-1c4989fe6bd8\") " pod="openshift-dns/dns-default-cx2qc"
Apr 24 21:17:55.021974 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:55.021811 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cc47cff2-ad26-4f59-b1d8-6831f76de599-cert\") pod \"ingress-canary-gdwvv\" (UID: \"cc47cff2-ad26-4f59-b1d8-6831f76de599\") " pod="openshift-ingress-canary/ingress-canary-gdwvv"
Apr 24 21:17:55.026313 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:55.026290 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwpp7\" (UniqueName: \"kubernetes.io/projected/cc47cff2-ad26-4f59-b1d8-6831f76de599-kube-api-access-gwpp7\") pod \"ingress-canary-gdwvv\" (UID: \"cc47cff2-ad26-4f59-b1d8-6831f76de599\") " pod="openshift-ingress-canary/ingress-canary-gdwvv"
Apr 24 21:17:55.118184 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:55.118155 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/6d07634f-dd2f-4257-8023-4cc8ec27fae2-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-5dm7q\" (UID: \"6d07634f-dd2f-4257-8023-4cc8ec27fae2\") " pod="openshift-insights/insights-runtime-extractor-5dm7q"
Apr 24 21:17:55.118318 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:55.118196 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a282a503-61d3-4d88-bb58-1c4989fe6bd8-config-volume\") pod \"dns-default-cx2qc\" (UID: \"a282a503-61d3-4d88-bb58-1c4989fe6bd8\") " pod="openshift-dns/dns-default-cx2qc"
Apr 24 21:17:55.118318 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:55.118243 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/6d07634f-dd2f-4257-8023-4cc8ec27fae2-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-5dm7q\" (UID: \"6d07634f-dd2f-4257-8023-4cc8ec27fae2\") " pod="openshift-insights/insights-runtime-extractor-5dm7q"
Apr 24 21:17:55.118318 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:55.118268 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-spr5m\" (UniqueName: \"kubernetes.io/projected/a282a503-61d3-4d88-bb58-1c4989fe6bd8-kube-api-access-spr5m\") pod \"dns-default-cx2qc\" (UID: \"a282a503-61d3-4d88-bb58-1c4989fe6bd8\") " pod="openshift-dns/dns-default-cx2qc"
Apr 24 21:17:55.118318 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:55.118296 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8s6rj\" (UniqueName: \"kubernetes.io/projected/6d07634f-dd2f-4257-8023-4cc8ec27fae2-kube-api-access-8s6rj\") pod \"insights-runtime-extractor-5dm7q\" (UID: \"6d07634f-dd2f-4257-8023-4cc8ec27fae2\") " pod="openshift-insights/insights-runtime-extractor-5dm7q"
Apr 24 21:17:55.118515 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:55.118320 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a282a503-61d3-4d88-bb58-1c4989fe6bd8-tmp-dir\") pod \"dns-default-cx2qc\" (UID: \"a282a503-61d3-4d88-bb58-1c4989fe6bd8\") " pod="openshift-dns/dns-default-cx2qc"
Apr 24 21:17:55.118515 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:55.118405 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a282a503-61d3-4d88-bb58-1c4989fe6bd8-metrics-tls\") pod \"dns-default-cx2qc\" (UID: \"a282a503-61d3-4d88-bb58-1c4989fe6bd8\") " pod="openshift-dns/dns-default-cx2qc"
Apr 24 21:17:55.118515 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:55.118460 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/6d07634f-dd2f-4257-8023-4cc8ec27fae2-crio-socket\") pod \"insights-runtime-extractor-5dm7q\" (UID: \"6d07634f-dd2f-4257-8023-4cc8ec27fae2\") " pod="openshift-insights/insights-runtime-extractor-5dm7q"
Apr 24 21:17:55.118515 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:55.118505 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/6d07634f-dd2f-4257-8023-4cc8ec27fae2-data-volume\") pod \"insights-runtime-extractor-5dm7q\" (UID: \"6d07634f-dd2f-4257-8023-4cc8ec27fae2\") " pod="openshift-insights/insights-runtime-extractor-5dm7q"
Apr 24 21:17:55.118771 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:55.118739 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/6d07634f-dd2f-4257-8023-4cc8ec27fae2-crio-socket\") pod \"insights-runtime-extractor-5dm7q\" (UID: \"6d07634f-dd2f-4257-8023-4cc8ec27fae2\") " pod="openshift-insights/insights-runtime-extractor-5dm7q"
Apr 24 21:17:55.118865 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:55.118844 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a282a503-61d3-4d88-bb58-1c4989fe6bd8-config-volume\") pod \"dns-default-cx2qc\" (UID: \"a282a503-61d3-4d88-bb58-1c4989fe6bd8\") " pod="openshift-dns/dns-default-cx2qc"
Apr 24 21:17:55.118865 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:55.118854 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a282a503-61d3-4d88-bb58-1c4989fe6bd8-tmp-dir\") pod \"dns-default-cx2qc\" (UID: \"a282a503-61d3-4d88-bb58-1c4989fe6bd8\") " pod="openshift-dns/dns-default-cx2qc"
Apr 24 21:17:55.119110 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:55.119089 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/6d07634f-dd2f-4257-8023-4cc8ec27fae2-data-volume\") pod \"insights-runtime-extractor-5dm7q\" (UID: \"6d07634f-dd2f-4257-8023-4cc8ec27fae2\") " pod="openshift-insights/insights-runtime-extractor-5dm7q"
Apr 24 21:17:55.119275 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:55.119256 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/6d07634f-dd2f-4257-8023-4cc8ec27fae2-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-5dm7q\" (UID: \"6d07634f-dd2f-4257-8023-4cc8ec27fae2\") " pod="openshift-insights/insights-runtime-extractor-5dm7q"
Apr 24 21:17:55.120844 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:55.120813 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a282a503-61d3-4d88-bb58-1c4989fe6bd8-metrics-tls\") pod \"dns-default-cx2qc\" (UID: \"a282a503-61d3-4d88-bb58-1c4989fe6bd8\") " pod="openshift-dns/dns-default-cx2qc"
Apr 24 21:17:55.120942 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:55.120843 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/6d07634f-dd2f-4257-8023-4cc8ec27fae2-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-5dm7q\" (UID: \"6d07634f-dd2f-4257-8023-4cc8ec27fae2\") " pod="openshift-insights/insights-runtime-extractor-5dm7q"
Apr 24 21:17:55.127115 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:55.127095 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8s6rj\" (UniqueName: \"kubernetes.io/projected/6d07634f-dd2f-4257-8023-4cc8ec27fae2-kube-api-access-8s6rj\") pod \"insights-runtime-extractor-5dm7q\" (UID: \"6d07634f-dd2f-4257-8023-4cc8ec27fae2\") " pod="openshift-insights/insights-runtime-extractor-5dm7q"
Apr 24 21:17:55.127188 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:55.127145 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-spr5m\" (UniqueName: \"kubernetes.io/projected/a282a503-61d3-4d88-bb58-1c4989fe6bd8-kube-api-access-spr5m\") pod \"dns-default-cx2qc\" (UID: \"a282a503-61d3-4d88-bb58-1c4989fe6bd8\") " pod="openshift-dns/dns-default-cx2qc"
Apr 24 21:17:55.160026 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:55.160002 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-gdwvv"
Apr 24 21:17:55.181967 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:55.181943 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-5dm7q"
Apr 24 21:17:55.229762 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:55.229742 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-cx2qc"
Apr 24 21:17:56.441084 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:56.441052 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ldwdd"
Apr 24 21:17:56.441524 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:56.441065 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qbhf5"
Apr 24 21:17:56.446555 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:56.446509 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 24 21:17:56.446682 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:56.446596 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-dd9qw\""
Apr 24 21:17:56.446682 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:56.446632 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 24 21:17:56.446682 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:56.446657 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-8v42p\""
Apr 24 21:17:56.446851 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:56.446777 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 24 21:17:57.235856 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:57.235828 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/23cacaf9-66cf-483e-89f1-70f1b4c3cc3c-metrics-certs\") pod \"network-metrics-daemon-ldwdd\" (UID: \"23cacaf9-66cf-483e-89f1-70f1b4c3cc3c\") " pod="openshift-multus/network-metrics-daemon-ldwdd"
Apr 24 21:17:57.238211 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:57.238187 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/23cacaf9-66cf-483e-89f1-70f1b4c3cc3c-metrics-certs\") pod \"network-metrics-daemon-ldwdd\" (UID: \"23cacaf9-66cf-483e-89f1-70f1b4c3cc3c\") " pod="openshift-multus/network-metrics-daemon-ldwdd"
Apr 24 21:17:57.246209 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:57.246190 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-cx2qc"]
Apr 24 21:17:57.250092 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:57.250073 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-gdwvv"]
Apr 24 21:17:57.250797 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:57.250777 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-5dm7q"]
Apr 24 21:17:57.326491 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:17:57.326432 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda282a503_61d3_4d88_bb58_1c4989fe6bd8.slice/crio-a879a5a796f324be0af0a6d1a1f4db3014266cd41ea278ea35f25f59299e2e80 WatchSource:0}: Error finding container a879a5a796f324be0af0a6d1a1f4db3014266cd41ea278ea35f25f59299e2e80: Status 404 returned error can't find the container with id a879a5a796f324be0af0a6d1a1f4db3014266cd41ea278ea35f25f59299e2e80
Apr 24 21:17:57.326711 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:17:57.326690 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d07634f_dd2f_4257_8023_4cc8ec27fae2.slice/crio-f6e2d14f26786b4d8e1b334435178c14fb3b725aff97a9ce88b197aeb5712034 WatchSource:0}: Error finding container f6e2d14f26786b4d8e1b334435178c14fb3b725aff97a9ce88b197aeb5712034: Status 404 returned error can't find the container with id f6e2d14f26786b4d8e1b334435178c14fb3b725aff97a9ce88b197aeb5712034
Apr 24 21:17:57.327078 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:17:57.327055 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcc47cff2_ad26_4f59_b1d8_6831f76de599.slice/crio-b22a8b62f796ad61233092a2081809b4b728642196a753e9af08d2101b778b03 WatchSource:0}: Error finding container b22a8b62f796ad61233092a2081809b4b728642196a753e9af08d2101b778b03: Status 404 returned error can't find the container with id b22a8b62f796ad61233092a2081809b4b728642196a753e9af08d2101b778b03
Apr 24 21:17:57.336565 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:57.336547 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6ldvv\" (UniqueName: \"kubernetes.io/projected/68f2b2b0-7441-4845-8db7-2d2bdb770218-kube-api-access-6ldvv\") pod \"network-check-target-qbhf5\" (UID: \"68f2b2b0-7441-4845-8db7-2d2bdb770218\") " pod="openshift-network-diagnostics/network-check-target-qbhf5"
Apr 24 21:17:57.338655 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:57.338630 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ldvv\" (UniqueName: \"kubernetes.io/projected/68f2b2b0-7441-4845-8db7-2d2bdb770218-kube-api-access-6ldvv\") pod \"network-check-target-qbhf5\" (UID: \"68f2b2b0-7441-4845-8db7-2d2bdb770218\") " pod="openshift-network-diagnostics/network-check-target-qbhf5"
Apr 24 21:17:57.352375 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:57.352355 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ldwdd"
Apr 24 21:17:57.358143 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:57.358128 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qbhf5"
Apr 24 21:17:57.484348 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:57.484078 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-ldwdd"]
Apr 24 21:17:57.487964 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:17:57.487905 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23cacaf9_66cf_483e_89f1_70f1b4c3cc3c.slice/crio-05d69e96855b61b24c3f0655074ae7d3bb37984b248ab784445424776588366d WatchSource:0}: Error finding container 05d69e96855b61b24c3f0655074ae7d3bb37984b248ab784445424776588366d: Status 404 returned error can't find the container with id 05d69e96855b61b24c3f0655074ae7d3bb37984b248ab784445424776588366d
Apr 24 21:17:57.495704 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:57.495677 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-qbhf5"]
Apr 24 21:17:57.499214 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:17:57.499191 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod68f2b2b0_7441_4845_8db7_2d2bdb770218.slice/crio-40453dd968605d11b37d031c3586c6e224527f6f4825964ccff4658dd454d655 WatchSource:0}: Error finding container 40453dd968605d11b37d031c3586c6e224527f6f4825964ccff4658dd454d655: Status 404 returned error can't find the container with id 40453dd968605d11b37d031c3586c6e224527f6f4825964ccff4658dd454d655
Apr 24 21:17:57.640459 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:57.640347 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-ldwdd" event={"ID":"23cacaf9-66cf-483e-89f1-70f1b4c3cc3c","Type":"ContainerStarted","Data":"05d69e96855b61b24c3f0655074ae7d3bb37984b248ab784445424776588366d"}
Apr 24 21:17:57.643045 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:57.643016 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kqxwj" event={"ID":"849dc6e4-06dd-4834-a5eb-de6ceebd649f","Type":"ContainerStarted","Data":"63c6b9027d3e9d40867234fd3ad68a904343a5a93da15752c0274c0ceffde25a"}
Apr 24 21:17:57.644087 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:57.644066 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-gdwvv" event={"ID":"cc47cff2-ad26-4f59-b1d8-6831f76de599","Type":"ContainerStarted","Data":"b22a8b62f796ad61233092a2081809b4b728642196a753e9af08d2101b778b03"}
Apr 24 21:17:57.645095 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:57.645075 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-qbhf5" event={"ID":"68f2b2b0-7441-4845-8db7-2d2bdb770218","Type":"ContainerStarted","Data":"40453dd968605d11b37d031c3586c6e224527f6f4825964ccff4658dd454d655"}
Apr 24 21:17:57.646043 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:57.646021 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-cx2qc" event={"ID":"a282a503-61d3-4d88-bb58-1c4989fe6bd8","Type":"ContainerStarted","Data":"a879a5a796f324be0af0a6d1a1f4db3014266cd41ea278ea35f25f59299e2e80"}
Apr 24 21:17:57.647390 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:57.647371 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-5dm7q" event={"ID":"6d07634f-dd2f-4257-8023-4cc8ec27fae2","Type":"ContainerStarted","Data":"2b1aaeec311d1d98e126615f56adcd53feb70509d8d4922981fa247e9cc56d3e"}
Apr 24 21:17:57.647390 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:57.647392 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-5dm7q" event={"ID":"6d07634f-dd2f-4257-8023-4cc8ec27fae2","Type":"ContainerStarted","Data":"f6e2d14f26786b4d8e1b334435178c14fb3b725aff97a9ce88b197aeb5712034"}
Apr 24 21:17:58.653871 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:58.653795 2560 generic.go:358] "Generic (PLEG): container finished" podID="849dc6e4-06dd-4834-a5eb-de6ceebd649f" containerID="63c6b9027d3e9d40867234fd3ad68a904343a5a93da15752c0274c0ceffde25a" exitCode=0
Apr 24 21:17:58.653871 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:17:58.653856 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kqxwj" event={"ID":"849dc6e4-06dd-4834-a5eb-de6ceebd649f","Type":"ContainerDied","Data":"63c6b9027d3e9d40867234fd3ad68a904343a5a93da15752c0274c0ceffde25a"}
Apr 24 21:18:01.663775 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:18:01.663742 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kqxwj" event={"ID":"849dc6e4-06dd-4834-a5eb-de6ceebd649f","Type":"ContainerStarted","Data":"5ce4a1834c4e1b304f57757fe21031ad92734d75bd8e72cabe2e17b86b1e3d74"}
Apr 24 21:18:01.665156 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:18:01.665129 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-gdwvv" event={"ID":"cc47cff2-ad26-4f59-b1d8-6831f76de599","Type":"ContainerStarted","Data":"3b981aec1aeb69a55d6d8d2cb8bb9a3b505eb0c226c9c7d0cd857a6b3187bd4b"}
Apr 24 21:18:01.688734 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:18:01.688671 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-gdwvv" podStartSLOduration=3.5385436439999998 podStartE2EDuration="7.688651567s" podCreationTimestamp="2026-04-24 21:17:54 +0000 UTC" firstStartedPulling="2026-04-24 21:17:57.346352074 +0000 UTC m=+65.405139730" lastFinishedPulling="2026-04-24 21:18:01.496459988 +0000 UTC m=+69.555247653" observedRunningTime="2026-04-24 21:18:01.687289174 +0000 UTC m=+69.746076859" watchObservedRunningTime="2026-04-24 21:18:01.688651567 +0000 UTC m=+69.747439240"
Apr 24 21:18:02.669999 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:18:02.669961 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-ldwdd" event={"ID":"23cacaf9-66cf-483e-89f1-70f1b4c3cc3c","Type":"ContainerStarted","Data":"ab07343fd897c8256899ce926c7d096c662fcdb7aaa43be26e07abf05e418aa4"}
Apr 24 21:18:02.669999 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:18:02.670004 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-ldwdd" event={"ID":"23cacaf9-66cf-483e-89f1-70f1b4c3cc3c","Type":"ContainerStarted","Data":"777e83e6cf14d16801978767af69b5688dbf5d8523b71c16c578703ce998a4c9"}
Apr 24 21:18:02.676365 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:18:02.676334 2560 generic.go:358] "Generic (PLEG): container finished" podID="849dc6e4-06dd-4834-a5eb-de6ceebd649f" containerID="5ce4a1834c4e1b304f57757fe21031ad92734d75bd8e72cabe2e17b86b1e3d74" exitCode=0
Apr 24 21:18:02.676489 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:18:02.676417 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kqxwj" event={"ID":"849dc6e4-06dd-4834-a5eb-de6ceebd649f","Type":"ContainerDied","Data":"5ce4a1834c4e1b304f57757fe21031ad92734d75bd8e72cabe2e17b86b1e3d74"}
Apr 24 21:18:02.678168 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:18:02.678143 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-qbhf5" event={"ID":"68f2b2b0-7441-4845-8db7-2d2bdb770218","Type":"ContainerStarted","Data":"e280873e13342089643569699721df94028c87a2ae00bc6cf770c76aad1690c3"}
Apr 24 21:18:02.678328 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:18:02.678284 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-qbhf5"
Apr 24 21:18:02.679940 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:18:02.679897 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-cx2qc" event={"ID":"a282a503-61d3-4d88-bb58-1c4989fe6bd8","Type":"ContainerStarted","Data":"ffe2e87eb771641c5b90a824c76c04184e58d2c0e42ab5a0a532ee587741af58"}
Apr 24 21:18:02.680044 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:18:02.679944 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-cx2qc" event={"ID":"a282a503-61d3-4d88-bb58-1c4989fe6bd8","Type":"ContainerStarted","Data":"f0f87941a0a8c9b0189bc1c1b8ec41ff64f38b37be09ff296abbacced3c08042"}
Apr 24 21:18:02.680105 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:18:02.680050 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-cx2qc"
Apr 24 21:18:02.681489 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:18:02.681469 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-5dm7q" event={"ID":"6d07634f-dd2f-4257-8023-4cc8ec27fae2","Type":"ContainerStarted","Data":"706f8f3ced31808da7697dc23dffdb70412dc843e05429bfa1a46f5e437fb310"}
Apr 24 21:18:02.691605 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:18:02.691567 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-ldwdd" podStartSLOduration=39.68426666 podStartE2EDuration="43.69155533s" podCreationTimestamp="2026-04-24 21:17:19 +0000 UTC" firstStartedPulling="2026-04-24 21:17:57.49153383 +0000 UTC m=+65.550321482" lastFinishedPulling="2026-04-24 21:18:01.498822492 +0000 UTC m=+69.557610152" observedRunningTime="2026-04-24 21:18:02.689961911 +0000 UTC m=+70.748749585" watchObservedRunningTime="2026-04-24 21:18:02.69155533 +0000 UTC m=+70.750343001"
Apr 24 21:18:02.707321 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:18:02.707252 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-qbhf5" podStartSLOduration=39.368253904 podStartE2EDuration="43.707240653s" podCreationTimestamp="2026-04-24 21:17:19 +0000 UTC" firstStartedPulling="2026-04-24 21:17:57.500826522 +0000 UTC m=+65.559614171" lastFinishedPulling="2026-04-24 21:18:01.83981327 +0000 UTC m=+69.898600920" observedRunningTime="2026-04-24 21:18:02.706204718 +0000 UTC m=+70.764992392" watchObservedRunningTime="2026-04-24 21:18:02.707240653 +0000 UTC m=+70.766028325"
Apr 24 21:18:02.725048 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:18:02.724861 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-cx2qc" podStartSLOduration=4.575290778 podStartE2EDuration="8.724844655s" podCreationTimestamp="2026-04-24 21:17:54 +0000 UTC" firstStartedPulling="2026-04-24 21:17:57.346351328 +0000 UTC m=+65.405138977" lastFinishedPulling="2026-04-24 21:18:01.495905199 +0000 UTC m=+69.554692854" observedRunningTime="2026-04-24 21:18:02.723268396 +0000 UTC m=+70.782056068" watchObservedRunningTime="2026-04-24 21:18:02.724844655 +0000 UTC m=+70.783632328"
Apr 24 21:18:03.688147 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:18:03.688116 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kqxwj" event={"ID":"849dc6e4-06dd-4834-a5eb-de6ceebd649f","Type":"ContainerStarted","Data":"820ff5ff8e15b7a22d818a9966ef498eecb5cfdf7a08ebf6e4db0a7ace7e9dce"}
Apr 24 21:18:03.689872 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:18:03.689841 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-5dm7q" event={"ID":"6d07634f-dd2f-4257-8023-4cc8ec27fae2","Type":"ContainerStarted","Data":"ecc4bc7c3fe7452001929096937401e0bc706fbd23cc174211fcbd8d559125a0"}
Apr 24 21:18:03.731332 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:18:03.731292 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-kqxwj" podStartSLOduration=13.020207911 podStartE2EDuration="44.731280215s" podCreationTimestamp="2026-04-24 21:17:19 +0000 UTC" firstStartedPulling="2026-04-24 21:17:25.661606256 +0000 UTC m=+33.720393905" lastFinishedPulling="2026-04-24 21:17:57.372678553 +0000 UTC m=+65.431466209" observedRunningTime="2026-04-24 21:18:03.729461385 +0000 UTC m=+71.788249081" watchObservedRunningTime="2026-04-24 21:18:03.731280215 +0000 UTC m=+71.790067887"
Apr 24 21:18:03.768785 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:18:03.768732 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-5dm7q" podStartSLOduration=3.612746917 podStartE2EDuration="9.76871619s" podCreationTimestamp="2026-04-24 21:17:54 +0000 UTC" firstStartedPulling="2026-04-24 21:17:57.441124888 +0000 UTC m=+65.499912538" lastFinishedPulling="2026-04-24 21:18:03.597094161 +0000 UTC m=+71.655881811" observedRunningTime="2026-04-24 21:18:03.764984414 +0000 UTC m=+71.823772088" watchObservedRunningTime="2026-04-24 21:18:03.76871619 +0000 UTC m=+71.827503861"
Apr 24 21:18:12.692277 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:18:12.692167 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-cx2qc"
Apr 24 21:18:20.633994 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:18:20.633967 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-6rbsp"
Apr 24 21:18:33.691720 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:18:33.691695 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-qbhf5"
Apr 24 21:19:41.413482 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:19:41.413448 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-lgtm8"]
Apr 24 21:19:41.415122 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:19:41.415106 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-lgtm8"
Apr 24 21:19:41.417474 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:19:41.417455 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 24 21:19:41.423277 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:19:41.423253 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-lgtm8"]
Apr 24 21:19:41.468995 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:19:41.468946 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/8f7f49cd-0b2f-4225-a283-3ee2a6d74934-kubelet-config\") pod \"global-pull-secret-syncer-lgtm8\" (UID: \"8f7f49cd-0b2f-4225-a283-3ee2a6d74934\") " pod="kube-system/global-pull-secret-syncer-lgtm8"
Apr 24 21:19:41.468995 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:19:41.468979 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8f7f49cd-0b2f-4225-a283-3ee2a6d74934-original-pull-secret\") pod \"global-pull-secret-syncer-lgtm8\" (UID: \"8f7f49cd-0b2f-4225-a283-3ee2a6d74934\") " pod="kube-system/global-pull-secret-syncer-lgtm8"
Apr 24 21:19:41.469113 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:19:41.469004 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/8f7f49cd-0b2f-4225-a283-3ee2a6d74934-dbus\") pod \"global-pull-secret-syncer-lgtm8\" (UID: \"8f7f49cd-0b2f-4225-a283-3ee2a6d74934\") " pod="kube-system/global-pull-secret-syncer-lgtm8"
Apr 24 21:19:41.569960 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:19:41.569916 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8f7f49cd-0b2f-4225-a283-3ee2a6d74934-original-pull-secret\") pod \"global-pull-secret-syncer-lgtm8\" (UID: \"8f7f49cd-0b2f-4225-a283-3ee2a6d74934\") " pod="kube-system/global-pull-secret-syncer-lgtm8"
Apr 24 21:19:41.570073 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:19:41.569979 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/8f7f49cd-0b2f-4225-a283-3ee2a6d74934-dbus\") pod \"global-pull-secret-syncer-lgtm8\" (UID: \"8f7f49cd-0b2f-4225-a283-3ee2a6d74934\") " pod="kube-system/global-pull-secret-syncer-lgtm8"
Apr 24 21:19:41.570073 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:19:41.570032 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/8f7f49cd-0b2f-4225-a283-3ee2a6d74934-kubelet-config\") pod \"global-pull-secret-syncer-lgtm8\" (UID: \"8f7f49cd-0b2f-4225-a283-3ee2a6d74934\") " pod="kube-system/global-pull-secret-syncer-lgtm8"
Apr 24 21:19:41.570169 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:19:41.570112 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/8f7f49cd-0b2f-4225-a283-3ee2a6d74934-kubelet-config\") pod \"global-pull-secret-syncer-lgtm8\" (UID: \"8f7f49cd-0b2f-4225-a283-3ee2a6d74934\") " pod="kube-system/global-pull-secret-syncer-lgtm8"
Apr 24 21:19:41.570169 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:19:41.570149 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/8f7f49cd-0b2f-4225-a283-3ee2a6d74934-dbus\") pod \"global-pull-secret-syncer-lgtm8\" (UID: \"8f7f49cd-0b2f-4225-a283-3ee2a6d74934\") " pod="kube-system/global-pull-secret-syncer-lgtm8"
Apr 24 21:19:41.573729 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:19:41.573715 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8f7f49cd-0b2f-4225-a283-3ee2a6d74934-original-pull-secret\") pod \"global-pull-secret-syncer-lgtm8\" (UID: \"8f7f49cd-0b2f-4225-a283-3ee2a6d74934\") " pod="kube-system/global-pull-secret-syncer-lgtm8"
Apr 24 21:19:41.724460 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:19:41.724407 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-lgtm8"
Apr 24 21:19:41.832568 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:19:41.832540 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-lgtm8"]
Apr 24 21:19:41.835496 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:19:41.835470 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f7f49cd_0b2f_4225_a283_3ee2a6d74934.slice/crio-96d00e790e9c8c4504c471977a002d27d6299aed7e78f7598a1b0d26cf398043 WatchSource:0}: Error finding container 96d00e790e9c8c4504c471977a002d27d6299aed7e78f7598a1b0d26cf398043: Status 404 returned error can't find the container with id 96d00e790e9c8c4504c471977a002d27d6299aed7e78f7598a1b0d26cf398043
Apr 24 21:19:41.927615 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:19:41.927581 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-lgtm8" event={"ID":"8f7f49cd-0b2f-4225-a283-3ee2a6d74934","Type":"ContainerStarted","Data":"96d00e790e9c8c4504c471977a002d27d6299aed7e78f7598a1b0d26cf398043"}
Apr 24 21:19:46.942073 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:19:46.942032 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-lgtm8" event={"ID":"8f7f49cd-0b2f-4225-a283-3ee2a6d74934","Type":"ContainerStarted","Data":"4b966e2666639f76d8d7eb3669dc1f4da3cd3698bb604c4f3c3d89c4a5af798c"}
Apr 24 21:19:46.956550 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:19:46.956503 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-lgtm8" podStartSLOduration=1.9081287310000001 podStartE2EDuration="5.956489308s" podCreationTimestamp="2026-04-24 21:19:41 +0000 UTC" firstStartedPulling="2026-04-24 21:19:41.840586458 +0000 UTC m=+169.899374107" lastFinishedPulling="2026-04-24 21:19:45.888947023 +0000 UTC m=+173.947734684" observedRunningTime="2026-04-24 21:19:46.955683239 +0000 UTC m=+175.014470911" watchObservedRunningTime="2026-04-24 21:19:46.956489308 +0000 UTC m=+175.015276980"
Apr 24 21:20:02.538050 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:02.538022 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwww2m"]
Apr 24 21:20:02.539824 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:02.539809 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwww2m"
Apr 24 21:20:02.542173 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:02.542154 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 24 21:20:02.542761 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:02.542741 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-d9lp2\""
Apr 24 21:20:02.542884 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:02.542768 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 24 21:20:02.553080 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:02.553054 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwww2m"]
Apr 24 21:20:02.608022 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:02.607990 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9mf8\" (UniqueName: \"kubernetes.io/projected/2de646fd-9094-4a71-9e5d-0f11d66ff654-kube-api-access-s9mf8\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwww2m\" (UID: \"2de646fd-9094-4a71-9e5d-0f11d66ff654\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwww2m"
Apr 24 21:20:02.608161 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:02.608036 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2de646fd-9094-4a71-9e5d-0f11d66ff654-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwww2m\" (UID: \"2de646fd-9094-4a71-9e5d-0f11d66ff654\") "
pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwww2m" Apr 24 21:20:02.608161 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:02.608090 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2de646fd-9094-4a71-9e5d-0f11d66ff654-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwww2m\" (UID: \"2de646fd-9094-4a71-9e5d-0f11d66ff654\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwww2m" Apr 24 21:20:02.708965 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:02.708916 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s9mf8\" (UniqueName: \"kubernetes.io/projected/2de646fd-9094-4a71-9e5d-0f11d66ff654-kube-api-access-s9mf8\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwww2m\" (UID: \"2de646fd-9094-4a71-9e5d-0f11d66ff654\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwww2m" Apr 24 21:20:02.709096 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:02.708978 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2de646fd-9094-4a71-9e5d-0f11d66ff654-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwww2m\" (UID: \"2de646fd-9094-4a71-9e5d-0f11d66ff654\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwww2m" Apr 24 21:20:02.709096 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:02.709012 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2de646fd-9094-4a71-9e5d-0f11d66ff654-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwww2m\" (UID: \"2de646fd-9094-4a71-9e5d-0f11d66ff654\") " 
pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwww2m" Apr 24 21:20:02.709309 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:02.709292 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2de646fd-9094-4a71-9e5d-0f11d66ff654-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwww2m\" (UID: \"2de646fd-9094-4a71-9e5d-0f11d66ff654\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwww2m" Apr 24 21:20:02.709818 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:02.709804 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2de646fd-9094-4a71-9e5d-0f11d66ff654-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwww2m\" (UID: \"2de646fd-9094-4a71-9e5d-0f11d66ff654\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwww2m" Apr 24 21:20:02.717422 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:02.717395 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9mf8\" (UniqueName: \"kubernetes.io/projected/2de646fd-9094-4a71-9e5d-0f11d66ff654-kube-api-access-s9mf8\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwww2m\" (UID: \"2de646fd-9094-4a71-9e5d-0f11d66ff654\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwww2m" Apr 24 21:20:02.848030 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:02.847948 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwww2m" Apr 24 21:20:02.968011 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:02.967984 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwww2m"] Apr 24 21:20:02.971830 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:20:02.971804 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2de646fd_9094_4a71_9e5d_0f11d66ff654.slice/crio-b2daceab51626ab8c7df694b78d297e95e8ff1f0b427b184f2e5b88cd0c22562 WatchSource:0}: Error finding container b2daceab51626ab8c7df694b78d297e95e8ff1f0b427b184f2e5b88cd0c22562: Status 404 returned error can't find the container with id b2daceab51626ab8c7df694b78d297e95e8ff1f0b427b184f2e5b88cd0c22562 Apr 24 21:20:02.979697 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:02.979675 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwww2m" event={"ID":"2de646fd-9094-4a71-9e5d-0f11d66ff654","Type":"ContainerStarted","Data":"b2daceab51626ab8c7df694b78d297e95e8ff1f0b427b184f2e5b88cd0c22562"} Apr 24 21:20:11.000485 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:11.000449 2560 generic.go:358] "Generic (PLEG): container finished" podID="2de646fd-9094-4a71-9e5d-0f11d66ff654" containerID="b0a31a1e5462d8f251260caf9f13e8310f4864586cb64ea7c784dca5e8952fa7" exitCode=0 Apr 24 21:20:11.000854 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:11.000514 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwww2m" event={"ID":"2de646fd-9094-4a71-9e5d-0f11d66ff654","Type":"ContainerDied","Data":"b0a31a1e5462d8f251260caf9f13e8310f4864586cb64ea7c784dca5e8952fa7"} Apr 24 21:20:14.009882 ip-10-0-128-142 kubenswrapper[2560]: 
I0424 21:20:14.009845 2560 generic.go:358] "Generic (PLEG): container finished" podID="2de646fd-9094-4a71-9e5d-0f11d66ff654" containerID="e21493683e476098618ef454ddb8ee45d27dd0640ca057ebb973737812ce020c" exitCode=0 Apr 24 21:20:14.010262 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:14.009915 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwww2m" event={"ID":"2de646fd-9094-4a71-9e5d-0f11d66ff654","Type":"ContainerDied","Data":"e21493683e476098618ef454ddb8ee45d27dd0640ca057ebb973737812ce020c"} Apr 24 21:20:23.040320 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:23.040278 2560 generic.go:358] "Generic (PLEG): container finished" podID="2de646fd-9094-4a71-9e5d-0f11d66ff654" containerID="45692109b5626d39eec45ac2aabffc6ab953eb9f0faa36c31b5d3bbb5321212e" exitCode=0 Apr 24 21:20:23.040723 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:23.040356 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwww2m" event={"ID":"2de646fd-9094-4a71-9e5d-0f11d66ff654","Type":"ContainerDied","Data":"45692109b5626d39eec45ac2aabffc6ab953eb9f0faa36c31b5d3bbb5321212e"} Apr 24 21:20:24.155117 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:24.155095 2560 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwww2m" Apr 24 21:20:24.259082 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:24.259044 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s9mf8\" (UniqueName: \"kubernetes.io/projected/2de646fd-9094-4a71-9e5d-0f11d66ff654-kube-api-access-s9mf8\") pod \"2de646fd-9094-4a71-9e5d-0f11d66ff654\" (UID: \"2de646fd-9094-4a71-9e5d-0f11d66ff654\") " Apr 24 21:20:24.259082 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:24.259087 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2de646fd-9094-4a71-9e5d-0f11d66ff654-util\") pod \"2de646fd-9094-4a71-9e5d-0f11d66ff654\" (UID: \"2de646fd-9094-4a71-9e5d-0f11d66ff654\") " Apr 24 21:20:24.259284 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:24.259126 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2de646fd-9094-4a71-9e5d-0f11d66ff654-bundle\") pod \"2de646fd-9094-4a71-9e5d-0f11d66ff654\" (UID: \"2de646fd-9094-4a71-9e5d-0f11d66ff654\") " Apr 24 21:20:24.259787 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:24.259755 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2de646fd-9094-4a71-9e5d-0f11d66ff654-bundle" (OuterVolumeSpecName: "bundle") pod "2de646fd-9094-4a71-9e5d-0f11d66ff654" (UID: "2de646fd-9094-4a71-9e5d-0f11d66ff654"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:20:24.261237 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:24.261214 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2de646fd-9094-4a71-9e5d-0f11d66ff654-kube-api-access-s9mf8" (OuterVolumeSpecName: "kube-api-access-s9mf8") pod "2de646fd-9094-4a71-9e5d-0f11d66ff654" (UID: "2de646fd-9094-4a71-9e5d-0f11d66ff654"). InnerVolumeSpecName "kube-api-access-s9mf8". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:20:24.264292 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:24.264263 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2de646fd-9094-4a71-9e5d-0f11d66ff654-util" (OuterVolumeSpecName: "util") pod "2de646fd-9094-4a71-9e5d-0f11d66ff654" (UID: "2de646fd-9094-4a71-9e5d-0f11d66ff654"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:20:24.360015 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:24.359912 2560 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2de646fd-9094-4a71-9e5d-0f11d66ff654-bundle\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\"" Apr 24 21:20:24.360015 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:24.359959 2560 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-s9mf8\" (UniqueName: \"kubernetes.io/projected/2de646fd-9094-4a71-9e5d-0f11d66ff654-kube-api-access-s9mf8\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\"" Apr 24 21:20:24.360015 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:24.359970 2560 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2de646fd-9094-4a71-9e5d-0f11d66ff654-util\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\"" Apr 24 21:20:25.046724 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:25.046691 2560 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwww2m" event={"ID":"2de646fd-9094-4a71-9e5d-0f11d66ff654","Type":"ContainerDied","Data":"b2daceab51626ab8c7df694b78d297e95e8ff1f0b427b184f2e5b88cd0c22562"} Apr 24 21:20:25.046724 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:25.046712 2560 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwww2m" Apr 24 21:20:25.046724 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:25.046721 2560 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b2daceab51626ab8c7df694b78d297e95e8ff1f0b427b184f2e5b88cd0c22562" Apr 24 21:20:29.406180 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:29.406145 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-8xrxd"] Apr 24 21:20:29.406597 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:29.406344 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2de646fd-9094-4a71-9e5d-0f11d66ff654" containerName="pull" Apr 24 21:20:29.406597 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:29.406354 2560 state_mem.go:107] "Deleted CPUSet assignment" podUID="2de646fd-9094-4a71-9e5d-0f11d66ff654" containerName="pull" Apr 24 21:20:29.406597 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:29.406368 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2de646fd-9094-4a71-9e5d-0f11d66ff654" containerName="util" Apr 24 21:20:29.406597 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:29.406373 2560 state_mem.go:107] "Deleted CPUSet assignment" podUID="2de646fd-9094-4a71-9e5d-0f11d66ff654" containerName="util" Apr 24 21:20:29.406597 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:29.406379 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="2de646fd-9094-4a71-9e5d-0f11d66ff654" containerName="extract" Apr 24 21:20:29.406597 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:29.406384 2560 state_mem.go:107] "Deleted CPUSet assignment" podUID="2de646fd-9094-4a71-9e5d-0f11d66ff654" containerName="extract" Apr 24 21:20:29.406597 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:29.406423 2560 memory_manager.go:356] "RemoveStaleState removing state" podUID="2de646fd-9094-4a71-9e5d-0f11d66ff654" containerName="extract" Apr 24 21:20:29.409475 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:29.409454 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-8xrxd" Apr 24 21:20:29.411626 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:29.411594 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 24 21:20:29.411626 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:29.411617 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 24 21:20:29.411870 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:29.411855 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 24 21:20:29.411982 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:29.411966 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-bdjn7\"" Apr 24 21:20:29.419810 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:29.419788 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-8xrxd"] Apr 24 21:20:29.494965 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:29.494901 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7pb86\" (UniqueName: 
\"kubernetes.io/projected/d7658512-d542-4d27-b1d6-df7b1bb7f154-kube-api-access-7pb86\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-8xrxd\" (UID: \"d7658512-d542-4d27-b1d6-df7b1bb7f154\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-8xrxd" Apr 24 21:20:29.495129 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:29.494979 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/d7658512-d542-4d27-b1d6-df7b1bb7f154-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-8xrxd\" (UID: \"d7658512-d542-4d27-b1d6-df7b1bb7f154\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-8xrxd" Apr 24 21:20:29.595250 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:29.595221 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/d7658512-d542-4d27-b1d6-df7b1bb7f154-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-8xrxd\" (UID: \"d7658512-d542-4d27-b1d6-df7b1bb7f154\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-8xrxd" Apr 24 21:20:29.595398 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:29.595272 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7pb86\" (UniqueName: \"kubernetes.io/projected/d7658512-d542-4d27-b1d6-df7b1bb7f154-kube-api-access-7pb86\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-8xrxd\" (UID: \"d7658512-d542-4d27-b1d6-df7b1bb7f154\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-8xrxd" Apr 24 21:20:29.597542 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:29.597522 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/d7658512-d542-4d27-b1d6-df7b1bb7f154-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-8xrxd\" (UID: 
\"d7658512-d542-4d27-b1d6-df7b1bb7f154\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-8xrxd" Apr 24 21:20:29.604246 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:29.604216 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7pb86\" (UniqueName: \"kubernetes.io/projected/d7658512-d542-4d27-b1d6-df7b1bb7f154-kube-api-access-7pb86\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-8xrxd\" (UID: \"d7658512-d542-4d27-b1d6-df7b1bb7f154\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-8xrxd" Apr 24 21:20:29.719326 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:29.719244 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-8xrxd" Apr 24 21:20:29.833746 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:29.833715 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-8xrxd"] Apr 24 21:20:29.837455 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:20:29.837423 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7658512_d542_4d27_b1d6_df7b1bb7f154.slice/crio-f31c24e82f667e076c68f8e1405e2d962679bec72e020cb08a95d7c250c635cf WatchSource:0}: Error finding container f31c24e82f667e076c68f8e1405e2d962679bec72e020cb08a95d7c250c635cf: Status 404 returned error can't find the container with id f31c24e82f667e076c68f8e1405e2d962679bec72e020cb08a95d7c250c635cf Apr 24 21:20:30.060445 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:30.060407 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-8xrxd" event={"ID":"d7658512-d542-4d27-b1d6-df7b1bb7f154","Type":"ContainerStarted","Data":"f31c24e82f667e076c68f8e1405e2d962679bec72e020cb08a95d7c250c635cf"} Apr 24 21:20:33.069516 ip-10-0-128-142 kubenswrapper[2560]: 
I0424 21:20:33.069482 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-8xrxd" event={"ID":"d7658512-d542-4d27-b1d6-df7b1bb7f154","Type":"ContainerStarted","Data":"975a03b33b0c65c6808be304b291109b84a485c31106ef870b637901a56fad42"} Apr 24 21:20:33.070004 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:33.069629 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-8xrxd" Apr 24 21:20:33.088416 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:33.088377 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-8xrxd" podStartSLOduration=0.999142931 podStartE2EDuration="4.088361992s" podCreationTimestamp="2026-04-24 21:20:29 +0000 UTC" firstStartedPulling="2026-04-24 21:20:29.83908583 +0000 UTC m=+217.897873482" lastFinishedPulling="2026-04-24 21:20:32.928304878 +0000 UTC m=+220.987092543" observedRunningTime="2026-04-24 21:20:33.087476193 +0000 UTC m=+221.146263865" watchObservedRunningTime="2026-04-24 21:20:33.088361992 +0000 UTC m=+221.147149665" Apr 24 21:20:33.462740 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:33.462708 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-c6xjz"] Apr 24 21:20:33.465733 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:33.465714 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-c6xjz" Apr 24 21:20:33.468072 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:33.468052 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\"" Apr 24 21:20:33.468262 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:33.468249 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-qgqnx\"" Apr 24 21:20:33.468439 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:33.468426 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 24 21:20:33.474761 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:33.474742 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-c6xjz"] Apr 24 21:20:33.524658 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:33.524628 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hgr2\" (UniqueName: \"kubernetes.io/projected/df108736-62fe-468e-8409-3ffd1cb43473-kube-api-access-6hgr2\") pod \"keda-operator-ffbb595cb-c6xjz\" (UID: \"df108736-62fe-468e-8409-3ffd1cb43473\") " pod="openshift-keda/keda-operator-ffbb595cb-c6xjz" Apr 24 21:20:33.524658 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:33.524664 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/df108736-62fe-468e-8409-3ffd1cb43473-certificates\") pod \"keda-operator-ffbb595cb-c6xjz\" (UID: \"df108736-62fe-468e-8409-3ffd1cb43473\") " pod="openshift-keda/keda-operator-ffbb595cb-c6xjz" Apr 24 21:20:33.524843 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:33.524759 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: 
\"kubernetes.io/configmap/df108736-62fe-468e-8409-3ffd1cb43473-cabundle0\") pod \"keda-operator-ffbb595cb-c6xjz\" (UID: \"df108736-62fe-468e-8409-3ffd1cb43473\") " pod="openshift-keda/keda-operator-ffbb595cb-c6xjz" Apr 24 21:20:33.625556 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:33.625521 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6hgr2\" (UniqueName: \"kubernetes.io/projected/df108736-62fe-468e-8409-3ffd1cb43473-kube-api-access-6hgr2\") pod \"keda-operator-ffbb595cb-c6xjz\" (UID: \"df108736-62fe-468e-8409-3ffd1cb43473\") " pod="openshift-keda/keda-operator-ffbb595cb-c6xjz" Apr 24 21:20:33.625556 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:33.625560 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/df108736-62fe-468e-8409-3ffd1cb43473-certificates\") pod \"keda-operator-ffbb595cb-c6xjz\" (UID: \"df108736-62fe-468e-8409-3ffd1cb43473\") " pod="openshift-keda/keda-operator-ffbb595cb-c6xjz" Apr 24 21:20:33.625747 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:33.625619 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/df108736-62fe-468e-8409-3ffd1cb43473-cabundle0\") pod \"keda-operator-ffbb595cb-c6xjz\" (UID: \"df108736-62fe-468e-8409-3ffd1cb43473\") " pod="openshift-keda/keda-operator-ffbb595cb-c6xjz" Apr 24 21:20:33.625747 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:20:33.625728 2560 projected.go:264] Couldn't get secret openshift-keda/keda-operator-certs: secret "keda-operator-certs" not found Apr 24 21:20:33.625810 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:20:33.625751 2560 secret.go:281] references non-existent secret key: ca.crt Apr 24 21:20:33.625810 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:20:33.625759 2560 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret 
key: ca.crt Apr 24 21:20:33.625810 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:20:33.625770 2560 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-c6xjz: [secret "keda-operator-certs" not found, references non-existent secret key: ca.crt] Apr 24 21:20:33.625899 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:20:33.625823 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/df108736-62fe-468e-8409-3ffd1cb43473-certificates podName:df108736-62fe-468e-8409-3ffd1cb43473 nodeName:}" failed. No retries permitted until 2026-04-24 21:20:34.125806953 +0000 UTC m=+222.184594602 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/df108736-62fe-468e-8409-3ffd1cb43473-certificates") pod "keda-operator-ffbb595cb-c6xjz" (UID: "df108736-62fe-468e-8409-3ffd1cb43473") : [secret "keda-operator-certs" not found, references non-existent secret key: ca.crt] Apr 24 21:20:33.626391 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:33.626371 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/df108736-62fe-468e-8409-3ffd1cb43473-cabundle0\") pod \"keda-operator-ffbb595cb-c6xjz\" (UID: \"df108736-62fe-468e-8409-3ffd1cb43473\") " pod="openshift-keda/keda-operator-ffbb595cb-c6xjz" Apr 24 21:20:33.636242 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:33.636217 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hgr2\" (UniqueName: \"kubernetes.io/projected/df108736-62fe-468e-8409-3ffd1cb43473-kube-api-access-6hgr2\") pod \"keda-operator-ffbb595cb-c6xjz\" (UID: \"df108736-62fe-468e-8409-3ffd1cb43473\") " pod="openshift-keda/keda-operator-ffbb595cb-c6xjz" Apr 24 21:20:33.776989 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:33.776955 2560 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-92n7z"] Apr 24 21:20:33.780111 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:33.780096 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-92n7z" Apr 24 21:20:33.782971 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:33.782952 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\"" Apr 24 21:20:33.789122 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:33.789101 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-92n7z"] Apr 24 21:20:33.827223 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:33.827201 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/ab65412c-a5f8-4c9b-8fc1-8d643ec9e337-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-92n7z\" (UID: \"ab65412c-a5f8-4c9b-8fc1-8d643ec9e337\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-92n7z" Apr 24 21:20:33.827352 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:33.827240 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rg2mt\" (UniqueName: \"kubernetes.io/projected/ab65412c-a5f8-4c9b-8fc1-8d643ec9e337-kube-api-access-rg2mt\") pod \"keda-metrics-apiserver-7c9f485588-92n7z\" (UID: \"ab65412c-a5f8-4c9b-8fc1-8d643ec9e337\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-92n7z" Apr 24 21:20:33.827352 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:33.827288 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/ab65412c-a5f8-4c9b-8fc1-8d643ec9e337-certificates\") pod \"keda-metrics-apiserver-7c9f485588-92n7z\" (UID: \"ab65412c-a5f8-4c9b-8fc1-8d643ec9e337\") " 
pod="openshift-keda/keda-metrics-apiserver-7c9f485588-92n7z" Apr 24 21:20:33.927986 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:33.927958 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/ab65412c-a5f8-4c9b-8fc1-8d643ec9e337-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-92n7z\" (UID: \"ab65412c-a5f8-4c9b-8fc1-8d643ec9e337\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-92n7z" Apr 24 21:20:33.928138 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:33.928007 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rg2mt\" (UniqueName: \"kubernetes.io/projected/ab65412c-a5f8-4c9b-8fc1-8d643ec9e337-kube-api-access-rg2mt\") pod \"keda-metrics-apiserver-7c9f485588-92n7z\" (UID: \"ab65412c-a5f8-4c9b-8fc1-8d643ec9e337\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-92n7z" Apr 24 21:20:33.928138 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:33.928101 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/ab65412c-a5f8-4c9b-8fc1-8d643ec9e337-certificates\") pod \"keda-metrics-apiserver-7c9f485588-92n7z\" (UID: \"ab65412c-a5f8-4c9b-8fc1-8d643ec9e337\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-92n7z" Apr 24 21:20:33.928251 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:20:33.928195 2560 secret.go:281] references non-existent secret key: tls.crt Apr 24 21:20:33.928251 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:20:33.928207 2560 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 24 21:20:33.928251 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:20:33.928223 2560 projected.go:264] Couldn't get secret openshift-keda/keda-metrics-apiserver-certs: secret "keda-metrics-apiserver-certs" not found Apr 24 21:20:33.928251 ip-10-0-128-142 kubenswrapper[2560]: 
E0424 21:20:33.928243 2560 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-92n7z: [references non-existent secret key: tls.crt, secret "keda-metrics-apiserver-certs" not found] Apr 24 21:20:33.928446 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:20:33.928303 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ab65412c-a5f8-4c9b-8fc1-8d643ec9e337-certificates podName:ab65412c-a5f8-4c9b-8fc1-8d643ec9e337 nodeName:}" failed. No retries permitted until 2026-04-24 21:20:34.428285198 +0000 UTC m=+222.487072847 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/ab65412c-a5f8-4c9b-8fc1-8d643ec9e337-certificates") pod "keda-metrics-apiserver-7c9f485588-92n7z" (UID: "ab65412c-a5f8-4c9b-8fc1-8d643ec9e337") : [references non-existent secret key: tls.crt, secret "keda-metrics-apiserver-certs" not found] Apr 24 21:20:33.928573 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:33.928547 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/ab65412c-a5f8-4c9b-8fc1-8d643ec9e337-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-92n7z\" (UID: \"ab65412c-a5f8-4c9b-8fc1-8d643ec9e337\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-92n7z" Apr 24 21:20:33.936453 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:33.936430 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rg2mt\" (UniqueName: \"kubernetes.io/projected/ab65412c-a5f8-4c9b-8fc1-8d643ec9e337-kube-api-access-rg2mt\") pod \"keda-metrics-apiserver-7c9f485588-92n7z\" (UID: \"ab65412c-a5f8-4c9b-8fc1-8d643ec9e337\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-92n7z" Apr 24 21:20:34.004342 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:34.004311 2560 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-keda/keda-admission-cf49989db-ldftr"] Apr 24 21:20:34.007323 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:34.007308 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-ldftr" Apr 24 21:20:34.010209 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:34.010192 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\"" Apr 24 21:20:34.021388 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:34.021365 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-ldftr"] Apr 24 21:20:34.130198 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:34.130116 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/df108736-62fe-468e-8409-3ffd1cb43473-certificates\") pod \"keda-operator-ffbb595cb-c6xjz\" (UID: \"df108736-62fe-468e-8409-3ffd1cb43473\") " pod="openshift-keda/keda-operator-ffbb595cb-c6xjz" Apr 24 21:20:34.130198 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:34.130169 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/aee887d1-e495-4cf8-b0e9-f00462766eef-certificates\") pod \"keda-admission-cf49989db-ldftr\" (UID: \"aee887d1-e495-4cf8-b0e9-f00462766eef\") " pod="openshift-keda/keda-admission-cf49989db-ldftr" Apr 24 21:20:34.130593 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:20:34.130278 2560 secret.go:281] references non-existent secret key: ca.crt Apr 24 21:20:34.130593 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:20:34.130298 2560 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 24 21:20:34.130593 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:20:34.130309 2560 projected.go:194] Error preparing data for 
projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-c6xjz: references non-existent secret key: ca.crt Apr 24 21:20:34.130593 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:20:34.130372 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/df108736-62fe-468e-8409-3ffd1cb43473-certificates podName:df108736-62fe-468e-8409-3ffd1cb43473 nodeName:}" failed. No retries permitted until 2026-04-24 21:20:35.130352137 +0000 UTC m=+223.189139795 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/df108736-62fe-468e-8409-3ffd1cb43473-certificates") pod "keda-operator-ffbb595cb-c6xjz" (UID: "df108736-62fe-468e-8409-3ffd1cb43473") : references non-existent secret key: ca.crt Apr 24 21:20:34.130593 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:34.130305 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5pmw7\" (UniqueName: \"kubernetes.io/projected/aee887d1-e495-4cf8-b0e9-f00462766eef-kube-api-access-5pmw7\") pod \"keda-admission-cf49989db-ldftr\" (UID: \"aee887d1-e495-4cf8-b0e9-f00462766eef\") " pod="openshift-keda/keda-admission-cf49989db-ldftr" Apr 24 21:20:34.230680 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:34.230652 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5pmw7\" (UniqueName: \"kubernetes.io/projected/aee887d1-e495-4cf8-b0e9-f00462766eef-kube-api-access-5pmw7\") pod \"keda-admission-cf49989db-ldftr\" (UID: \"aee887d1-e495-4cf8-b0e9-f00462766eef\") " pod="openshift-keda/keda-admission-cf49989db-ldftr" Apr 24 21:20:34.230808 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:34.230712 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/aee887d1-e495-4cf8-b0e9-f00462766eef-certificates\") pod \"keda-admission-cf49989db-ldftr\" (UID: 
\"aee887d1-e495-4cf8-b0e9-f00462766eef\") " pod="openshift-keda/keda-admission-cf49989db-ldftr" Apr 24 21:20:34.230808 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:20:34.230800 2560 projected.go:264] Couldn't get secret openshift-keda/keda-admission-webhooks-certs: secret "keda-admission-webhooks-certs" not found Apr 24 21:20:34.230891 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:20:34.230818 2560 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-admission-cf49989db-ldftr: secret "keda-admission-webhooks-certs" not found Apr 24 21:20:34.230933 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:20:34.230893 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/aee887d1-e495-4cf8-b0e9-f00462766eef-certificates podName:aee887d1-e495-4cf8-b0e9-f00462766eef nodeName:}" failed. No retries permitted until 2026-04-24 21:20:34.730876683 +0000 UTC m=+222.789664335 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/aee887d1-e495-4cf8-b0e9-f00462766eef-certificates") pod "keda-admission-cf49989db-ldftr" (UID: "aee887d1-e495-4cf8-b0e9-f00462766eef") : secret "keda-admission-webhooks-certs" not found Apr 24 21:20:34.242079 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:34.242056 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5pmw7\" (UniqueName: \"kubernetes.io/projected/aee887d1-e495-4cf8-b0e9-f00462766eef-kube-api-access-5pmw7\") pod \"keda-admission-cf49989db-ldftr\" (UID: \"aee887d1-e495-4cf8-b0e9-f00462766eef\") " pod="openshift-keda/keda-admission-cf49989db-ldftr" Apr 24 21:20:34.432013 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:34.431910 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/ab65412c-a5f8-4c9b-8fc1-8d643ec9e337-certificates\") pod 
\"keda-metrics-apiserver-7c9f485588-92n7z\" (UID: \"ab65412c-a5f8-4c9b-8fc1-8d643ec9e337\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-92n7z" Apr 24 21:20:34.432182 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:20:34.432078 2560 secret.go:281] references non-existent secret key: tls.crt Apr 24 21:20:34.432182 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:20:34.432101 2560 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 24 21:20:34.432182 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:20:34.432124 2560 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-92n7z: references non-existent secret key: tls.crt Apr 24 21:20:34.432333 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:20:34.432189 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ab65412c-a5f8-4c9b-8fc1-8d643ec9e337-certificates podName:ab65412c-a5f8-4c9b-8fc1-8d643ec9e337 nodeName:}" failed. No retries permitted until 2026-04-24 21:20:35.432170669 +0000 UTC m=+223.490958325 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/ab65412c-a5f8-4c9b-8fc1-8d643ec9e337-certificates") pod "keda-metrics-apiserver-7c9f485588-92n7z" (UID: "ab65412c-a5f8-4c9b-8fc1-8d643ec9e337") : references non-existent secret key: tls.crt Apr 24 21:20:34.734586 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:34.734496 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/aee887d1-e495-4cf8-b0e9-f00462766eef-certificates\") pod \"keda-admission-cf49989db-ldftr\" (UID: \"aee887d1-e495-4cf8-b0e9-f00462766eef\") " pod="openshift-keda/keda-admission-cf49989db-ldftr" Apr 24 21:20:34.736974 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:34.736949 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/aee887d1-e495-4cf8-b0e9-f00462766eef-certificates\") pod \"keda-admission-cf49989db-ldftr\" (UID: \"aee887d1-e495-4cf8-b0e9-f00462766eef\") " pod="openshift-keda/keda-admission-cf49989db-ldftr" Apr 24 21:20:34.918704 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:34.918667 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-ldftr" Apr 24 21:20:35.038555 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:35.038519 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-ldftr"] Apr 24 21:20:35.042411 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:20:35.042375 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaee887d1_e495_4cf8_b0e9_f00462766eef.slice/crio-e7924db4e2002d79d594a6f091a487ef461aa2ba9a2c40c50813dfb3812cad13 WatchSource:0}: Error finding container e7924db4e2002d79d594a6f091a487ef461aa2ba9a2c40c50813dfb3812cad13: Status 404 returned error can't find the container with id e7924db4e2002d79d594a6f091a487ef461aa2ba9a2c40c50813dfb3812cad13 Apr 24 21:20:35.074460 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:35.074432 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-ldftr" event={"ID":"aee887d1-e495-4cf8-b0e9-f00462766eef","Type":"ContainerStarted","Data":"e7924db4e2002d79d594a6f091a487ef461aa2ba9a2c40c50813dfb3812cad13"} Apr 24 21:20:35.138367 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:35.138338 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/df108736-62fe-468e-8409-3ffd1cb43473-certificates\") pod \"keda-operator-ffbb595cb-c6xjz\" (UID: \"df108736-62fe-468e-8409-3ffd1cb43473\") " pod="openshift-keda/keda-operator-ffbb595cb-c6xjz" Apr 24 21:20:35.138731 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:20:35.138463 2560 secret.go:281] references non-existent secret key: ca.crt Apr 24 21:20:35.138731 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:20:35.138474 2560 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 24 21:20:35.138731 ip-10-0-128-142 kubenswrapper[2560]: E0424 
21:20:35.138483 2560 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-c6xjz: references non-existent secret key: ca.crt Apr 24 21:20:35.138731 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:20:35.138538 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/df108736-62fe-468e-8409-3ffd1cb43473-certificates podName:df108736-62fe-468e-8409-3ffd1cb43473 nodeName:}" failed. No retries permitted until 2026-04-24 21:20:37.138524355 +0000 UTC m=+225.197312005 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/df108736-62fe-468e-8409-3ffd1cb43473-certificates") pod "keda-operator-ffbb595cb-c6xjz" (UID: "df108736-62fe-468e-8409-3ffd1cb43473") : references non-existent secret key: ca.crt Apr 24 21:20:35.441182 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:35.441141 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/ab65412c-a5f8-4c9b-8fc1-8d643ec9e337-certificates\") pod \"keda-metrics-apiserver-7c9f485588-92n7z\" (UID: \"ab65412c-a5f8-4c9b-8fc1-8d643ec9e337\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-92n7z" Apr 24 21:20:35.441383 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:20:35.441283 2560 secret.go:281] references non-existent secret key: tls.crt Apr 24 21:20:35.441383 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:20:35.441304 2560 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 24 21:20:35.441383 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:20:35.441327 2560 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-92n7z: references non-existent secret key: tls.crt Apr 24 21:20:35.441548 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:20:35.441389 2560 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ab65412c-a5f8-4c9b-8fc1-8d643ec9e337-certificates podName:ab65412c-a5f8-4c9b-8fc1-8d643ec9e337 nodeName:}" failed. No retries permitted until 2026-04-24 21:20:37.4413702 +0000 UTC m=+225.500157851 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/ab65412c-a5f8-4c9b-8fc1-8d643ec9e337-certificates") pod "keda-metrics-apiserver-7c9f485588-92n7z" (UID: "ab65412c-a5f8-4c9b-8fc1-8d643ec9e337") : references non-existent secret key: tls.crt Apr 24 21:20:37.080244 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:37.080164 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-ldftr" event={"ID":"aee887d1-e495-4cf8-b0e9-f00462766eef","Type":"ContainerStarted","Data":"76e1cacbdffb815aebdb059cd81e103227177e54b793cd6056b32aa665f5e2cd"} Apr 24 21:20:37.080625 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:37.080278 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-ldftr" Apr 24 21:20:37.096400 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:37.096346 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-ldftr" podStartSLOduration=2.319081888 podStartE2EDuration="4.096331844s" podCreationTimestamp="2026-04-24 21:20:33 +0000 UTC" firstStartedPulling="2026-04-24 21:20:35.043682358 +0000 UTC m=+223.102470008" lastFinishedPulling="2026-04-24 21:20:36.820932301 +0000 UTC m=+224.879719964" observedRunningTime="2026-04-24 21:20:37.095354231 +0000 UTC m=+225.154141926" watchObservedRunningTime="2026-04-24 21:20:37.096331844 +0000 UTC m=+225.155119516" Apr 24 21:20:37.155659 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:37.155632 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: 
\"kubernetes.io/projected/df108736-62fe-468e-8409-3ffd1cb43473-certificates\") pod \"keda-operator-ffbb595cb-c6xjz\" (UID: \"df108736-62fe-468e-8409-3ffd1cb43473\") " pod="openshift-keda/keda-operator-ffbb595cb-c6xjz" Apr 24 21:20:37.155782 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:20:37.155721 2560 secret.go:281] references non-existent secret key: ca.crt Apr 24 21:20:37.155782 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:20:37.155732 2560 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 24 21:20:37.155782 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:20:37.155740 2560 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-c6xjz: references non-existent secret key: ca.crt Apr 24 21:20:37.155900 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:20:37.155802 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/df108736-62fe-468e-8409-3ffd1cb43473-certificates podName:df108736-62fe-468e-8409-3ffd1cb43473 nodeName:}" failed. No retries permitted until 2026-04-24 21:20:41.155785625 +0000 UTC m=+229.214573279 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/df108736-62fe-468e-8409-3ffd1cb43473-certificates") pod "keda-operator-ffbb595cb-c6xjz" (UID: "df108736-62fe-468e-8409-3ffd1cb43473") : references non-existent secret key: ca.crt Apr 24 21:20:37.456817 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:37.456752 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/ab65412c-a5f8-4c9b-8fc1-8d643ec9e337-certificates\") pod \"keda-metrics-apiserver-7c9f485588-92n7z\" (UID: \"ab65412c-a5f8-4c9b-8fc1-8d643ec9e337\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-92n7z" Apr 24 21:20:37.459082 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:37.459065 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/ab65412c-a5f8-4c9b-8fc1-8d643ec9e337-certificates\") pod \"keda-metrics-apiserver-7c9f485588-92n7z\" (UID: \"ab65412c-a5f8-4c9b-8fc1-8d643ec9e337\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-92n7z" Apr 24 21:20:37.690955 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:37.690918 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-92n7z" Apr 24 21:20:37.802429 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:37.802402 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-92n7z"] Apr 24 21:20:37.805154 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:20:37.805127 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab65412c_a5f8_4c9b_8fc1_8d643ec9e337.slice/crio-85200bfc39569052dcc687029573f87980f0ea7664decdcfaf9e60acf3859a7d WatchSource:0}: Error finding container 85200bfc39569052dcc687029573f87980f0ea7664decdcfaf9e60acf3859a7d: Status 404 returned error can't find the container with id 85200bfc39569052dcc687029573f87980f0ea7664decdcfaf9e60acf3859a7d Apr 24 21:20:38.083553 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:38.083520 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-92n7z" event={"ID":"ab65412c-a5f8-4c9b-8fc1-8d643ec9e337","Type":"ContainerStarted","Data":"85200bfc39569052dcc687029573f87980f0ea7664decdcfaf9e60acf3859a7d"} Apr 24 21:20:41.093193 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:41.093153 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-92n7z" event={"ID":"ab65412c-a5f8-4c9b-8fc1-8d643ec9e337","Type":"ContainerStarted","Data":"0a3a0e773b9248ac4a97c1457b21073227f47318a3889f773e0b7b9a7ea92e0d"} Apr 24 21:20:41.093599 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:41.093282 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-92n7z" Apr 24 21:20:41.109838 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:41.109768 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-92n7z" 
podStartSLOduration=5.784473133 podStartE2EDuration="8.109753973s" podCreationTimestamp="2026-04-24 21:20:33 +0000 UTC" firstStartedPulling="2026-04-24 21:20:37.806393423 +0000 UTC m=+225.865181073" lastFinishedPulling="2026-04-24 21:20:40.131674252 +0000 UTC m=+228.190461913" observedRunningTime="2026-04-24 21:20:41.108901889 +0000 UTC m=+229.167689557" watchObservedRunningTime="2026-04-24 21:20:41.109753973 +0000 UTC m=+229.168541644" Apr 24 21:20:41.187090 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:41.187052 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/df108736-62fe-468e-8409-3ffd1cb43473-certificates\") pod \"keda-operator-ffbb595cb-c6xjz\" (UID: \"df108736-62fe-468e-8409-3ffd1cb43473\") " pod="openshift-keda/keda-operator-ffbb595cb-c6xjz" Apr 24 21:20:41.189400 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:41.189378 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/df108736-62fe-468e-8409-3ffd1cb43473-certificates\") pod \"keda-operator-ffbb595cb-c6xjz\" (UID: \"df108736-62fe-468e-8409-3ffd1cb43473\") " pod="openshift-keda/keda-operator-ffbb595cb-c6xjz" Apr 24 21:20:41.275627 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:41.275587 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-c6xjz" Apr 24 21:20:41.389750 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:41.389712 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-c6xjz"] Apr 24 21:20:41.392812 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:20:41.392787 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf108736_62fe_468e_8409_3ffd1cb43473.slice/crio-1057beb28c3897c3f45a89804f72c1bde7bf66ad5e11272307163bbe93513f52 WatchSource:0}: Error finding container 1057beb28c3897c3f45a89804f72c1bde7bf66ad5e11272307163bbe93513f52: Status 404 returned error can't find the container with id 1057beb28c3897c3f45a89804f72c1bde7bf66ad5e11272307163bbe93513f52 Apr 24 21:20:42.097109 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:42.097073 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-c6xjz" event={"ID":"df108736-62fe-468e-8409-3ffd1cb43473","Type":"ContainerStarted","Data":"1057beb28c3897c3f45a89804f72c1bde7bf66ad5e11272307163bbe93513f52"} Apr 24 21:20:45.106627 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:45.106542 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-c6xjz" event={"ID":"df108736-62fe-468e-8409-3ffd1cb43473","Type":"ContainerStarted","Data":"d5a1cdefb84832e15e5f259f1ce75cca78c3913c70d9732a5a738034030fb8ee"} Apr 24 21:20:45.107037 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:45.106672 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-c6xjz" Apr 24 21:20:45.125597 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:45.125546 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-c6xjz" podStartSLOduration=8.723364473 podStartE2EDuration="12.125497229s" 
podCreationTimestamp="2026-04-24 21:20:33 +0000 UTC" firstStartedPulling="2026-04-24 21:20:41.39459061 +0000 UTC m=+229.453378260" lastFinishedPulling="2026-04-24 21:20:44.796723363 +0000 UTC m=+232.855511016" observedRunningTime="2026-04-24 21:20:45.125309973 +0000 UTC m=+233.184097646" watchObservedRunningTime="2026-04-24 21:20:45.125497229 +0000 UTC m=+233.184284904" Apr 24 21:20:52.101483 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:52.101443 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-92n7z" Apr 24 21:20:54.074125 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:54.074096 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-8xrxd" Apr 24 21:20:58.086392 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:20:58.086356 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-ldftr" Apr 24 21:21:06.110787 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:06.110755 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-c6xjz" Apr 24 21:21:27.679695 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:27.679611 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19df8glm"] Apr 24 21:21:27.684259 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:27.684242 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19df8glm" Apr 24 21:21:27.686722 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:27.686699 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 24 21:21:27.686863 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:27.686819 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 24 21:21:27.687672 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:27.687654 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-d9lp2\"" Apr 24 21:21:27.690547 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:27.690523 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19df8glm"] Apr 24 21:21:27.808296 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:27.808252 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c2ae47a1-6977-4937-a99b-f6efa1bcc0ef-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19df8glm\" (UID: \"c2ae47a1-6977-4937-a99b-f6efa1bcc0ef\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19df8glm" Apr 24 21:21:27.808475 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:27.808307 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c2ae47a1-6977-4937-a99b-f6efa1bcc0ef-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19df8glm\" (UID: \"c2ae47a1-6977-4937-a99b-f6efa1bcc0ef\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19df8glm" Apr 24 21:21:27.808475 
ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:27.808415 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpnlk\" (UniqueName: \"kubernetes.io/projected/c2ae47a1-6977-4937-a99b-f6efa1bcc0ef-kube-api-access-dpnlk\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19df8glm\" (UID: \"c2ae47a1-6977-4937-a99b-f6efa1bcc0ef\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19df8glm" Apr 24 21:21:27.909588 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:27.909555 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dpnlk\" (UniqueName: \"kubernetes.io/projected/c2ae47a1-6977-4937-a99b-f6efa1bcc0ef-kube-api-access-dpnlk\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19df8glm\" (UID: \"c2ae47a1-6977-4937-a99b-f6efa1bcc0ef\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19df8glm" Apr 24 21:21:27.909722 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:27.909620 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c2ae47a1-6977-4937-a99b-f6efa1bcc0ef-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19df8glm\" (UID: \"c2ae47a1-6977-4937-a99b-f6efa1bcc0ef\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19df8glm" Apr 24 21:21:27.909722 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:27.909645 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c2ae47a1-6977-4937-a99b-f6efa1bcc0ef-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19df8glm\" (UID: \"c2ae47a1-6977-4937-a99b-f6efa1bcc0ef\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19df8glm" Apr 24 21:21:27.910046 ip-10-0-128-142 
kubenswrapper[2560]: I0424 21:21:27.910026 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c2ae47a1-6977-4937-a99b-f6efa1bcc0ef-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19df8glm\" (UID: \"c2ae47a1-6977-4937-a99b-f6efa1bcc0ef\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19df8glm" Apr 24 21:21:27.910090 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:27.910058 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c2ae47a1-6977-4937-a99b-f6efa1bcc0ef-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19df8glm\" (UID: \"c2ae47a1-6977-4937-a99b-f6efa1bcc0ef\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19df8glm" Apr 24 21:21:27.917142 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:27.917112 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dpnlk\" (UniqueName: \"kubernetes.io/projected/c2ae47a1-6977-4937-a99b-f6efa1bcc0ef-kube-api-access-dpnlk\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19df8glm\" (UID: \"c2ae47a1-6977-4937-a99b-f6efa1bcc0ef\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19df8glm" Apr 24 21:21:27.993919 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:27.993841 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19df8glm"
Apr 24 21:21:28.107359 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:28.107330 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19df8glm"]
Apr 24 21:21:28.110535 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:21:28.110501 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2ae47a1_6977_4937_a99b_f6efa1bcc0ef.slice/crio-5ce4b448cb23d8535701cca3e35a83961eb568d7b31c2ccd2901de7f246d13ee WatchSource:0}: Error finding container 5ce4b448cb23d8535701cca3e35a83961eb568d7b31c2ccd2901de7f246d13ee: Status 404 returned error can't find the container with id 5ce4b448cb23d8535701cca3e35a83961eb568d7b31c2ccd2901de7f246d13ee
Apr 24 21:21:28.217024 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:28.216995 2560 generic.go:358] "Generic (PLEG): container finished" podID="c2ae47a1-6977-4937-a99b-f6efa1bcc0ef" containerID="a43d3f1dd795a947700588277d638bdc0fd9e3ca8ed1f6a6babbdfcd280a5c36" exitCode=0
Apr 24 21:21:28.217141 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:28.217081 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19df8glm" event={"ID":"c2ae47a1-6977-4937-a99b-f6efa1bcc0ef","Type":"ContainerDied","Data":"a43d3f1dd795a947700588277d638bdc0fd9e3ca8ed1f6a6babbdfcd280a5c36"}
Apr 24 21:21:28.217141 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:28.217115 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19df8glm" event={"ID":"c2ae47a1-6977-4937-a99b-f6efa1bcc0ef","Type":"ContainerStarted","Data":"5ce4b448cb23d8535701cca3e35a83961eb568d7b31c2ccd2901de7f246d13ee"}
Apr 24 21:21:30.224668 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:30.224635 2560 generic.go:358] "Generic (PLEG): container finished" podID="c2ae47a1-6977-4937-a99b-f6efa1bcc0ef" containerID="1a0df0ea69cf64d9b381ddc7358d17d21e59c3491b5564205440122f139530ca" exitCode=0
Apr 24 21:21:30.225133 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:30.224674 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19df8glm" event={"ID":"c2ae47a1-6977-4937-a99b-f6efa1bcc0ef","Type":"ContainerDied","Data":"1a0df0ea69cf64d9b381ddc7358d17d21e59c3491b5564205440122f139530ca"}
Apr 24 21:21:31.229023 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:31.228990 2560 generic.go:358] "Generic (PLEG): container finished" podID="c2ae47a1-6977-4937-a99b-f6efa1bcc0ef" containerID="7d6cfcb3dcfad054b56613d9e7560a74abd5198be7cf93a861b3e5d0d7be299f" exitCode=0
Apr 24 21:21:31.229401 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:31.229064 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19df8glm" event={"ID":"c2ae47a1-6977-4937-a99b-f6efa1bcc0ef","Type":"ContainerDied","Data":"7d6cfcb3dcfad054b56613d9e7560a74abd5198be7cf93a861b3e5d0d7be299f"}
Apr 24 21:21:32.352790 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:32.352766 2560 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19df8glm"
Apr 24 21:21:32.547560 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:32.547518 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dpnlk\" (UniqueName: \"kubernetes.io/projected/c2ae47a1-6977-4937-a99b-f6efa1bcc0ef-kube-api-access-dpnlk\") pod \"c2ae47a1-6977-4937-a99b-f6efa1bcc0ef\" (UID: \"c2ae47a1-6977-4937-a99b-f6efa1bcc0ef\") "
Apr 24 21:21:32.547748 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:32.547586 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c2ae47a1-6977-4937-a99b-f6efa1bcc0ef-bundle\") pod \"c2ae47a1-6977-4937-a99b-f6efa1bcc0ef\" (UID: \"c2ae47a1-6977-4937-a99b-f6efa1bcc0ef\") "
Apr 24 21:21:32.547748 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:32.547636 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c2ae47a1-6977-4937-a99b-f6efa1bcc0ef-util\") pod \"c2ae47a1-6977-4937-a99b-f6efa1bcc0ef\" (UID: \"c2ae47a1-6977-4937-a99b-f6efa1bcc0ef\") "
Apr 24 21:21:32.548320 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:32.548296 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2ae47a1-6977-4937-a99b-f6efa1bcc0ef-bundle" (OuterVolumeSpecName: "bundle") pod "c2ae47a1-6977-4937-a99b-f6efa1bcc0ef" (UID: "c2ae47a1-6977-4937-a99b-f6efa1bcc0ef"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:21:32.549557 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:32.549523 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2ae47a1-6977-4937-a99b-f6efa1bcc0ef-kube-api-access-dpnlk" (OuterVolumeSpecName: "kube-api-access-dpnlk") pod "c2ae47a1-6977-4937-a99b-f6efa1bcc0ef" (UID: "c2ae47a1-6977-4937-a99b-f6efa1bcc0ef"). InnerVolumeSpecName "kube-api-access-dpnlk". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:21:32.555472 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:32.555447 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2ae47a1-6977-4937-a99b-f6efa1bcc0ef-util" (OuterVolumeSpecName: "util") pod "c2ae47a1-6977-4937-a99b-f6efa1bcc0ef" (UID: "c2ae47a1-6977-4937-a99b-f6efa1bcc0ef"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:21:32.648543 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:32.648496 2560 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c2ae47a1-6977-4937-a99b-f6efa1bcc0ef-util\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\""
Apr 24 21:21:32.648543 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:32.648538 2560 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dpnlk\" (UniqueName: \"kubernetes.io/projected/c2ae47a1-6977-4937-a99b-f6efa1bcc0ef-kube-api-access-dpnlk\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\""
Apr 24 21:21:32.648543 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:32.648549 2560 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c2ae47a1-6977-4937-a99b-f6efa1bcc0ef-bundle\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\""
Apr 24 21:21:33.236879 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:33.236842 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19df8glm" event={"ID":"c2ae47a1-6977-4937-a99b-f6efa1bcc0ef","Type":"ContainerDied","Data":"5ce4b448cb23d8535701cca3e35a83961eb568d7b31c2ccd2901de7f246d13ee"}
Apr 24 21:21:33.236879 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:33.236861 2560 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19df8glm"
Apr 24 21:21:33.236879 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:33.236876 2560 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5ce4b448cb23d8535701cca3e35a83961eb568d7b31c2ccd2901de7f246d13ee"
Apr 24 21:21:40.796426 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:40.796391 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-cs98l"]
Apr 24 21:21:40.796810 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:40.796611 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c2ae47a1-6977-4937-a99b-f6efa1bcc0ef" containerName="extract"
Apr 24 21:21:40.796810 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:40.796621 2560 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2ae47a1-6977-4937-a99b-f6efa1bcc0ef" containerName="extract"
Apr 24 21:21:40.796810 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:40.796637 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c2ae47a1-6977-4937-a99b-f6efa1bcc0ef" containerName="util"
Apr 24 21:21:40.796810 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:40.796642 2560 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2ae47a1-6977-4937-a99b-f6efa1bcc0ef" containerName="util"
Apr 24 21:21:40.796810 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:40.796647 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c2ae47a1-6977-4937-a99b-f6efa1bcc0ef" containerName="pull"
Apr 24 21:21:40.796810 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:40.796653 2560 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2ae47a1-6977-4937-a99b-f6efa1bcc0ef" containerName="pull"
Apr 24 21:21:40.796810 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:40.796687 2560 memory_manager.go:356] "RemoveStaleState removing state" podUID="c2ae47a1-6977-4937-a99b-f6efa1bcc0ef" containerName="extract"
Apr 24 21:21:40.798816 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:40.798799 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-cs98l"
Apr 24 21:21:40.811401 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:40.811381 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\""
Apr 24 21:21:40.811638 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:40.811622 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-9cjwq\""
Apr 24 21:21:40.812200 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:40.812185 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\""
Apr 24 21:21:40.821914 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:40.821894 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-cs98l"]
Apr 24 21:21:40.896882 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:40.896851 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/992985bd-1f99-4161-a262-07c0d46c1da0-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-cs98l\" (UID: \"992985bd-1f99-4161-a262-07c0d46c1da0\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-cs98l"
Apr 24 21:21:40.897030 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:40.896902 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xq9gl\" (UniqueName: \"kubernetes.io/projected/992985bd-1f99-4161-a262-07c0d46c1da0-kube-api-access-xq9gl\") pod \"cert-manager-operator-controller-manager-54b9655956-cs98l\" (UID: \"992985bd-1f99-4161-a262-07c0d46c1da0\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-cs98l"
Apr 24 21:21:40.997817 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:40.997781 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/992985bd-1f99-4161-a262-07c0d46c1da0-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-cs98l\" (UID: \"992985bd-1f99-4161-a262-07c0d46c1da0\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-cs98l"
Apr 24 21:21:40.997993 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:40.997843 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xq9gl\" (UniqueName: \"kubernetes.io/projected/992985bd-1f99-4161-a262-07c0d46c1da0-kube-api-access-xq9gl\") pod \"cert-manager-operator-controller-manager-54b9655956-cs98l\" (UID: \"992985bd-1f99-4161-a262-07c0d46c1da0\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-cs98l"
Apr 24 21:21:40.998175 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:40.998154 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/992985bd-1f99-4161-a262-07c0d46c1da0-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-cs98l\" (UID: \"992985bd-1f99-4161-a262-07c0d46c1da0\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-cs98l"
Apr 24 21:21:41.005896 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:41.005871 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xq9gl\" (UniqueName: \"kubernetes.io/projected/992985bd-1f99-4161-a262-07c0d46c1da0-kube-api-access-xq9gl\") pod \"cert-manager-operator-controller-manager-54b9655956-cs98l\" (UID: \"992985bd-1f99-4161-a262-07c0d46c1da0\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-cs98l"
Apr 24 21:21:41.107564 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:41.107485 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-cs98l"
Apr 24 21:21:41.242906 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:41.242879 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-cs98l"]
Apr 24 21:21:41.245840 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:21:41.245809 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod992985bd_1f99_4161_a262_07c0d46c1da0.slice/crio-7df2b108dbb279b7dba35788ae4147a4f849e2c0f9a80a213c4e3549ee62905f WatchSource:0}: Error finding container 7df2b108dbb279b7dba35788ae4147a4f849e2c0f9a80a213c4e3549ee62905f: Status 404 returned error can't find the container with id 7df2b108dbb279b7dba35788ae4147a4f849e2c0f9a80a213c4e3549ee62905f
Apr 24 21:21:41.257977 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:41.257954 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-cs98l" event={"ID":"992985bd-1f99-4161-a262-07c0d46c1da0","Type":"ContainerStarted","Data":"7df2b108dbb279b7dba35788ae4147a4f849e2c0f9a80a213c4e3549ee62905f"}
Apr 24 21:21:43.265717 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:43.265678 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-cs98l" event={"ID":"992985bd-1f99-4161-a262-07c0d46c1da0","Type":"ContainerStarted","Data":"9e27bdcabc75bc05ce6b9505fc4ab9263ed06ee36ff53799487e0448eadf7866"}
Apr 24 21:21:49.189221 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:49.189173 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-cs98l" podStartSLOduration=7.51476377 podStartE2EDuration="9.189156847s" podCreationTimestamp="2026-04-24 21:21:40 +0000 UTC" firstStartedPulling="2026-04-24 21:21:41.248086178 +0000 UTC m=+289.306873828" lastFinishedPulling="2026-04-24 21:21:42.92247925 +0000 UTC m=+290.981266905" observedRunningTime="2026-04-24 21:21:43.30572541 +0000 UTC m=+291.364513081" watchObservedRunningTime="2026-04-24 21:21:49.189156847 +0000 UTC m=+297.247944518"
Apr 24 21:21:49.189604 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:49.189551 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-46v4x"]
Apr 24 21:21:49.191573 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:49.191558 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-68b757865b-46v4x"
Apr 24 21:21:49.193804 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:49.193784 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-cainjector-dockercfg-2vmk5\""
Apr 24 21:21:49.193936 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:49.193797 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\""
Apr 24 21:21:49.194584 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:49.194568 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\""
Apr 24 21:21:49.199340 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:49.199314 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-46v4x"]
Apr 24 21:21:49.258240 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:49.258217 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3aede771-e2db-428a-84c6-2aaf1cc033b0-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-46v4x\" (UID: \"3aede771-e2db-428a-84c6-2aaf1cc033b0\") " pod="cert-manager/cert-manager-cainjector-68b757865b-46v4x"
Apr 24 21:21:49.258338 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:49.258246 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnsjd\" (UniqueName: \"kubernetes.io/projected/3aede771-e2db-428a-84c6-2aaf1cc033b0-kube-api-access-wnsjd\") pod \"cert-manager-cainjector-68b757865b-46v4x\" (UID: \"3aede771-e2db-428a-84c6-2aaf1cc033b0\") " pod="cert-manager/cert-manager-cainjector-68b757865b-46v4x"
Apr 24 21:21:49.359120 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:49.359090 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3aede771-e2db-428a-84c6-2aaf1cc033b0-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-46v4x\" (UID: \"3aede771-e2db-428a-84c6-2aaf1cc033b0\") " pod="cert-manager/cert-manager-cainjector-68b757865b-46v4x"
Apr 24 21:21:49.359120 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:49.359130 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wnsjd\" (UniqueName: \"kubernetes.io/projected/3aede771-e2db-428a-84c6-2aaf1cc033b0-kube-api-access-wnsjd\") pod \"cert-manager-cainjector-68b757865b-46v4x\" (UID: \"3aede771-e2db-428a-84c6-2aaf1cc033b0\") " pod="cert-manager/cert-manager-cainjector-68b757865b-46v4x"
Apr 24 21:21:49.366989 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:49.366957 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3aede771-e2db-428a-84c6-2aaf1cc033b0-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-46v4x\" (UID: \"3aede771-e2db-428a-84c6-2aaf1cc033b0\") " pod="cert-manager/cert-manager-cainjector-68b757865b-46v4x"
Apr 24 21:21:49.367121 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:49.367073 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnsjd\" (UniqueName: \"kubernetes.io/projected/3aede771-e2db-428a-84c6-2aaf1cc033b0-kube-api-access-wnsjd\") pod \"cert-manager-cainjector-68b757865b-46v4x\" (UID: \"3aede771-e2db-428a-84c6-2aaf1cc033b0\") " pod="cert-manager/cert-manager-cainjector-68b757865b-46v4x"
Apr 24 21:21:49.513116 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:49.513091 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-68b757865b-46v4x"
Apr 24 21:21:49.624964 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:49.624915 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-46v4x"]
Apr 24 21:21:49.628064 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:21:49.628036 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3aede771_e2db_428a_84c6_2aaf1cc033b0.slice/crio-321070bb5ccd96ee2a1ccc6a53f7a72f4803a3ea1beb5acd41980c751443f984 WatchSource:0}: Error finding container 321070bb5ccd96ee2a1ccc6a53f7a72f4803a3ea1beb5acd41980c751443f984: Status 404 returned error can't find the container with id 321070bb5ccd96ee2a1ccc6a53f7a72f4803a3ea1beb5acd41980c751443f984
Apr 24 21:21:49.787460 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:49.787371 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fd5cr8"]
Apr 24 21:21:49.790146 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:49.790125 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fd5cr8"
Apr 24 21:21:49.792524 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:49.792506 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 24 21:21:49.793314 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:49.793290 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-d9lp2\""
Apr 24 21:21:49.793314 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:49.793303 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 24 21:21:49.798701 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:49.798681 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fd5cr8"]
Apr 24 21:21:49.863075 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:49.863041 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/29507f86-6cef-459d-9622-57284028d137-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fd5cr8\" (UID: \"29507f86-6cef-459d-9622-57284028d137\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fd5cr8"
Apr 24 21:21:49.863075 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:49.863074 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zfdx\" (UniqueName: \"kubernetes.io/projected/29507f86-6cef-459d-9622-57284028d137-kube-api-access-6zfdx\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fd5cr8\" (UID: \"29507f86-6cef-459d-9622-57284028d137\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fd5cr8"
Apr 24 21:21:49.863268 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:49.863102 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/29507f86-6cef-459d-9622-57284028d137-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fd5cr8\" (UID: \"29507f86-6cef-459d-9622-57284028d137\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fd5cr8"
Apr 24 21:21:49.963997 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:49.963966 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/29507f86-6cef-459d-9622-57284028d137-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fd5cr8\" (UID: \"29507f86-6cef-459d-9622-57284028d137\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fd5cr8"
Apr 24 21:21:49.964123 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:49.964003 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6zfdx\" (UniqueName: \"kubernetes.io/projected/29507f86-6cef-459d-9622-57284028d137-kube-api-access-6zfdx\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fd5cr8\" (UID: \"29507f86-6cef-459d-9622-57284028d137\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fd5cr8"
Apr 24 21:21:49.964123 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:49.964033 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/29507f86-6cef-459d-9622-57284028d137-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fd5cr8\" (UID: \"29507f86-6cef-459d-9622-57284028d137\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fd5cr8"
Apr 24 21:21:49.964374 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:49.964355 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/29507f86-6cef-459d-9622-57284028d137-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fd5cr8\" (UID: \"29507f86-6cef-459d-9622-57284028d137\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fd5cr8"
Apr 24 21:21:49.964407 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:49.964380 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/29507f86-6cef-459d-9622-57284028d137-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fd5cr8\" (UID: \"29507f86-6cef-459d-9622-57284028d137\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fd5cr8"
Apr 24 21:21:49.972046 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:49.972021 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zfdx\" (UniqueName: \"kubernetes.io/projected/29507f86-6cef-459d-9622-57284028d137-kube-api-access-6zfdx\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fd5cr8\" (UID: \"29507f86-6cef-459d-9622-57284028d137\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fd5cr8"
Apr 24 21:21:50.099729 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:50.099653 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fd5cr8"
Apr 24 21:21:50.214574 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:50.214547 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fd5cr8"]
Apr 24 21:21:50.217296 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:21:50.217272 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29507f86_6cef_459d_9622_57284028d137.slice/crio-88b434bdb3477f6985a03fcd37baec84318c50a896e979e5cf15f7b979686f7d WatchSource:0}: Error finding container 88b434bdb3477f6985a03fcd37baec84318c50a896e979e5cf15f7b979686f7d: Status 404 returned error can't find the container with id 88b434bdb3477f6985a03fcd37baec84318c50a896e979e5cf15f7b979686f7d
Apr 24 21:21:50.289438 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:50.289403 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fd5cr8" event={"ID":"29507f86-6cef-459d-9622-57284028d137","Type":"ContainerStarted","Data":"408ba62901e9f392f23c772f38c38887f1ae8572588311d10f7aa915540c2431"}
Apr 24 21:21:50.289438 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:50.289442 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fd5cr8" event={"ID":"29507f86-6cef-459d-9622-57284028d137","Type":"ContainerStarted","Data":"88b434bdb3477f6985a03fcd37baec84318c50a896e979e5cf15f7b979686f7d"}
Apr 24 21:21:50.290629 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:50.290606 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-68b757865b-46v4x" event={"ID":"3aede771-e2db-428a-84c6-2aaf1cc033b0","Type":"ContainerStarted","Data":"321070bb5ccd96ee2a1ccc6a53f7a72f4803a3ea1beb5acd41980c751443f984"}
Apr 24 21:21:51.294639 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:51.294606 2560 generic.go:358] "Generic (PLEG): container finished" podID="29507f86-6cef-459d-9622-57284028d137" containerID="408ba62901e9f392f23c772f38c38887f1ae8572588311d10f7aa915540c2431" exitCode=0
Apr 24 21:21:51.295036 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:51.294691 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fd5cr8" event={"ID":"29507f86-6cef-459d-9622-57284028d137","Type":"ContainerDied","Data":"408ba62901e9f392f23c772f38c38887f1ae8572588311d10f7aa915540c2431"}
Apr 24 21:21:52.334997 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:52.334967 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6rbsp_260be1ca-939b-4b9e-9f93-078d2506aef0/ovn-acl-logging/0.log"
Apr 24 21:21:52.335476 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:52.335458 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6rbsp_260be1ca-939b-4b9e-9f93-078d2506aef0/ovn-acl-logging/0.log"
Apr 24 21:21:52.336056 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:52.336028 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-128-142.ec2.internal_784ad9f5e87f1e75291c25fc06106e5e/kube-rbac-proxy-crio/2.log"
Apr 24 21:21:52.336491 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:52.336475 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-128-142.ec2.internal_784ad9f5e87f1e75291c25fc06106e5e/kube-rbac-proxy-crio/2.log"
Apr 24 21:21:52.338629 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:52.338612 2560 kubelet.go:1628] "Image garbage collection succeeded"
Apr 24 21:21:54.306517 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:54.306481 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-68b757865b-46v4x" event={"ID":"3aede771-e2db-428a-84c6-2aaf1cc033b0","Type":"ContainerStarted","Data":"9b57de7ce11c306d486a086812392beee85a2a32d8085fb941dc407f4fd2b5a1"}
Apr 24 21:21:54.325495 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:54.325440 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-68b757865b-46v4x" podStartSLOduration=0.873209494 podStartE2EDuration="5.325423644s" podCreationTimestamp="2026-04-24 21:21:49 +0000 UTC" firstStartedPulling="2026-04-24 21:21:49.63031104 +0000 UTC m=+297.689098690" lastFinishedPulling="2026-04-24 21:21:54.08252519 +0000 UTC m=+302.141312840" observedRunningTime="2026-04-24 21:21:54.32427054 +0000 UTC m=+302.383058215" watchObservedRunningTime="2026-04-24 21:21:54.325423644 +0000 UTC m=+302.384211316"
Apr 24 21:21:55.016546 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:55.016514 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-79c8d999ff-hljkt"]
Apr 24 21:21:55.019534 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:55.019518 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-79c8d999ff-hljkt"
Apr 24 21:21:55.021538 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:55.021519 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-stf6m\""
Apr 24 21:21:55.029151 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:55.029128 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-79c8d999ff-hljkt"]
Apr 24 21:21:55.102025 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:55.101994 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/83303ed0-daca-42b0-8259-d3e9caa96140-bound-sa-token\") pod \"cert-manager-79c8d999ff-hljkt\" (UID: \"83303ed0-daca-42b0-8259-d3e9caa96140\") " pod="cert-manager/cert-manager-79c8d999ff-hljkt"
Apr 24 21:21:55.102176 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:55.102049 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dv6h2\" (UniqueName: \"kubernetes.io/projected/83303ed0-daca-42b0-8259-d3e9caa96140-kube-api-access-dv6h2\") pod \"cert-manager-79c8d999ff-hljkt\" (UID: \"83303ed0-daca-42b0-8259-d3e9caa96140\") " pod="cert-manager/cert-manager-79c8d999ff-hljkt"
Apr 24 21:21:55.202394 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:55.202366 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dv6h2\" (UniqueName: \"kubernetes.io/projected/83303ed0-daca-42b0-8259-d3e9caa96140-kube-api-access-dv6h2\") pod \"cert-manager-79c8d999ff-hljkt\" (UID: \"83303ed0-daca-42b0-8259-d3e9caa96140\") " pod="cert-manager/cert-manager-79c8d999ff-hljkt"
Apr 24 21:21:55.202553 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:55.202413 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/83303ed0-daca-42b0-8259-d3e9caa96140-bound-sa-token\") pod \"cert-manager-79c8d999ff-hljkt\" (UID: \"83303ed0-daca-42b0-8259-d3e9caa96140\") " pod="cert-manager/cert-manager-79c8d999ff-hljkt"
Apr 24 21:21:55.210489 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:55.210465 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/83303ed0-daca-42b0-8259-d3e9caa96140-bound-sa-token\") pod \"cert-manager-79c8d999ff-hljkt\" (UID: \"83303ed0-daca-42b0-8259-d3e9caa96140\") " pod="cert-manager/cert-manager-79c8d999ff-hljkt"
Apr 24 21:21:55.210633 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:55.210615 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dv6h2\" (UniqueName: \"kubernetes.io/projected/83303ed0-daca-42b0-8259-d3e9caa96140-kube-api-access-dv6h2\") pod \"cert-manager-79c8d999ff-hljkt\" (UID: \"83303ed0-daca-42b0-8259-d3e9caa96140\") " pod="cert-manager/cert-manager-79c8d999ff-hljkt"
Apr 24 21:21:55.327828 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:55.327750 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-79c8d999ff-hljkt"
Apr 24 21:21:55.443399 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:55.443304 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-79c8d999ff-hljkt"]
Apr 24 21:21:55.446436 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:21:55.446400 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod83303ed0_daca_42b0_8259_d3e9caa96140.slice/crio-d8230b7685e42560e30e0e5f97ff2b68e95a5c55f5dcf2ede1454db3017da0e4 WatchSource:0}: Error finding container d8230b7685e42560e30e0e5f97ff2b68e95a5c55f5dcf2ede1454db3017da0e4: Status 404 returned error can't find the container with id d8230b7685e42560e30e0e5f97ff2b68e95a5c55f5dcf2ede1454db3017da0e4
Apr 24 21:21:55.449518 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:55.449501 2560 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 21:21:56.314494 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:56.314461 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-79c8d999ff-hljkt" event={"ID":"83303ed0-daca-42b0-8259-d3e9caa96140","Type":"ContainerStarted","Data":"d68df3ce036108723e4fde6a7f1703088fa48fcaa2124fcdfa5f31f470a5e59f"}
Apr 24 21:21:56.314621 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:56.314498 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-79c8d999ff-hljkt" event={"ID":"83303ed0-daca-42b0-8259-d3e9caa96140","Type":"ContainerStarted","Data":"d8230b7685e42560e30e0e5f97ff2b68e95a5c55f5dcf2ede1454db3017da0e4"}
Apr 24 21:21:56.316113 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:56.316093 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fd5cr8" event={"ID":"29507f86-6cef-459d-9622-57284028d137","Type":"ContainerStarted","Data":"0ab8dea04c66231ae08f8b7fad4fd8b79a9858cc318ab6f2a209ddb6923e8c80"}
Apr 24 21:21:56.339946 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:56.339888 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-79c8d999ff-hljkt" podStartSLOduration=1.339872589 podStartE2EDuration="1.339872589s" podCreationTimestamp="2026-04-24 21:21:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:21:56.338555903 +0000 UTC m=+304.397343576" watchObservedRunningTime="2026-04-24 21:21:56.339872589 +0000 UTC m=+304.398660261"
Apr 24 21:21:57.320784 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:57.320750 2560 generic.go:358] "Generic (PLEG): container finished" podID="29507f86-6cef-459d-9622-57284028d137" containerID="0ab8dea04c66231ae08f8b7fad4fd8b79a9858cc318ab6f2a209ddb6923e8c80" exitCode=0
Apr 24 21:21:57.320969 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:57.320835 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fd5cr8" event={"ID":"29507f86-6cef-459d-9622-57284028d137","Type":"ContainerDied","Data":"0ab8dea04c66231ae08f8b7fad4fd8b79a9858cc318ab6f2a209ddb6923e8c80"}
Apr 24 21:21:58.326558 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:58.326524 2560 generic.go:358] "Generic (PLEG): container finished" podID="29507f86-6cef-459d-9622-57284028d137" containerID="642c8a49fb5660acb62a19feca4953a81c21564073ad575b56450178c4af27af" exitCode=0
Apr 24 21:21:58.326996 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:58.326570 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fd5cr8"
event={"ID":"29507f86-6cef-459d-9622-57284028d137","Type":"ContainerDied","Data":"642c8a49fb5660acb62a19feca4953a81c21564073ad575b56450178c4af27af"} Apr 24 21:21:59.446215 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:59.446194 2560 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fd5cr8" Apr 24 21:21:59.534489 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:59.534456 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6zfdx\" (UniqueName: \"kubernetes.io/projected/29507f86-6cef-459d-9622-57284028d137-kube-api-access-6zfdx\") pod \"29507f86-6cef-459d-9622-57284028d137\" (UID: \"29507f86-6cef-459d-9622-57284028d137\") " Apr 24 21:21:59.534654 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:59.534507 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/29507f86-6cef-459d-9622-57284028d137-util\") pod \"29507f86-6cef-459d-9622-57284028d137\" (UID: \"29507f86-6cef-459d-9622-57284028d137\") " Apr 24 21:21:59.534654 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:59.534561 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/29507f86-6cef-459d-9622-57284028d137-bundle\") pod \"29507f86-6cef-459d-9622-57284028d137\" (UID: \"29507f86-6cef-459d-9622-57284028d137\") " Apr 24 21:21:59.535004 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:59.534973 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29507f86-6cef-459d-9622-57284028d137-bundle" (OuterVolumeSpecName: "bundle") pod "29507f86-6cef-459d-9622-57284028d137" (UID: "29507f86-6cef-459d-9622-57284028d137"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:21:59.536572 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:59.536542 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29507f86-6cef-459d-9622-57284028d137-kube-api-access-6zfdx" (OuterVolumeSpecName: "kube-api-access-6zfdx") pod "29507f86-6cef-459d-9622-57284028d137" (UID: "29507f86-6cef-459d-9622-57284028d137"). InnerVolumeSpecName "kube-api-access-6zfdx". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:21:59.539232 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:59.539211 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29507f86-6cef-459d-9622-57284028d137-util" (OuterVolumeSpecName: "util") pod "29507f86-6cef-459d-9622-57284028d137" (UID: "29507f86-6cef-459d-9622-57284028d137"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:21:59.635241 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:59.635150 2560 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6zfdx\" (UniqueName: \"kubernetes.io/projected/29507f86-6cef-459d-9622-57284028d137-kube-api-access-6zfdx\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\"" Apr 24 21:21:59.635241 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:59.635183 2560 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/29507f86-6cef-459d-9622-57284028d137-util\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\"" Apr 24 21:21:59.635241 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:21:59.635197 2560 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/29507f86-6cef-459d-9622-57284028d137-bundle\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\"" Apr 24 21:22:00.336114 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:00.336085 2560 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fd5cr8" Apr 24 21:22:00.336284 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:00.336080 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fd5cr8" event={"ID":"29507f86-6cef-459d-9622-57284028d137","Type":"ContainerDied","Data":"88b434bdb3477f6985a03fcd37baec84318c50a896e979e5cf15f7b979686f7d"} Apr 24 21:22:00.336284 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:00.336193 2560 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="88b434bdb3477f6985a03fcd37baec84318c50a896e979e5cf15f7b979686f7d" Apr 24 21:22:08.446189 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:08.446155 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-w7ggm"] Apr 24 21:22:08.446552 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:08.446392 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="29507f86-6cef-459d-9622-57284028d137" containerName="util" Apr 24 21:22:08.446552 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:08.446402 2560 state_mem.go:107] "Deleted CPUSet assignment" podUID="29507f86-6cef-459d-9622-57284028d137" containerName="util" Apr 24 21:22:08.446552 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:08.446410 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="29507f86-6cef-459d-9622-57284028d137" containerName="pull" Apr 24 21:22:08.446552 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:08.446415 2560 state_mem.go:107] "Deleted CPUSet assignment" podUID="29507f86-6cef-459d-9622-57284028d137" containerName="pull" Apr 24 21:22:08.446552 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:08.446421 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="29507f86-6cef-459d-9622-57284028d137" containerName="extract" Apr 24 21:22:08.446552 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:08.446427 2560 state_mem.go:107] "Deleted CPUSet assignment" podUID="29507f86-6cef-459d-9622-57284028d137" containerName="extract" Apr 24 21:22:08.446552 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:08.446467 2560 memory_manager.go:356] "RemoveStaleState removing state" podUID="29507f86-6cef-459d-9622-57284028d137" containerName="extract" Apr 24 21:22:08.449390 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:08.449372 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-w7ggm" Apr 24 21:22:08.451799 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:08.451780 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 24 21:22:08.451861 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:08.451796 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"openshift-lws-operator-dockercfg-6dzth\"" Apr 24 21:22:08.452555 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:08.452540 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 24 21:22:08.456838 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:08.456818 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-w7ggm"] Apr 24 21:22:08.497603 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:08.497576 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xvsv\" (UniqueName: \"kubernetes.io/projected/486f6b20-c896-4f8a-8188-84a01590a54a-kube-api-access-5xvsv\") pod \"openshift-lws-operator-bfc7f696d-w7ggm\" (UID: \"486f6b20-c896-4f8a-8188-84a01590a54a\") " 
pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-w7ggm" Apr 24 21:22:08.497742 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:08.497609 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/486f6b20-c896-4f8a-8188-84a01590a54a-tmp\") pod \"openshift-lws-operator-bfc7f696d-w7ggm\" (UID: \"486f6b20-c896-4f8a-8188-84a01590a54a\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-w7ggm" Apr 24 21:22:08.598183 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:08.598156 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/486f6b20-c896-4f8a-8188-84a01590a54a-tmp\") pod \"openshift-lws-operator-bfc7f696d-w7ggm\" (UID: \"486f6b20-c896-4f8a-8188-84a01590a54a\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-w7ggm" Apr 24 21:22:08.598307 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:08.598227 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5xvsv\" (UniqueName: \"kubernetes.io/projected/486f6b20-c896-4f8a-8188-84a01590a54a-kube-api-access-5xvsv\") pod \"openshift-lws-operator-bfc7f696d-w7ggm\" (UID: \"486f6b20-c896-4f8a-8188-84a01590a54a\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-w7ggm" Apr 24 21:22:08.598524 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:08.598507 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/486f6b20-c896-4f8a-8188-84a01590a54a-tmp\") pod \"openshift-lws-operator-bfc7f696d-w7ggm\" (UID: \"486f6b20-c896-4f8a-8188-84a01590a54a\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-w7ggm" Apr 24 21:22:08.606246 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:08.606221 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xvsv\" (UniqueName: 
\"kubernetes.io/projected/486f6b20-c896-4f8a-8188-84a01590a54a-kube-api-access-5xvsv\") pod \"openshift-lws-operator-bfc7f696d-w7ggm\" (UID: \"486f6b20-c896-4f8a-8188-84a01590a54a\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-w7ggm" Apr 24 21:22:08.758978 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:08.758956 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-w7ggm" Apr 24 21:22:08.870795 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:08.870721 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-w7ggm"] Apr 24 21:22:08.873177 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:22:08.873149 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod486f6b20_c896_4f8a_8188_84a01590a54a.slice/crio-464b7b3aea19779c7911084c3b4156dc37b2d499aeef06f5a839501a0ebff957 WatchSource:0}: Error finding container 464b7b3aea19779c7911084c3b4156dc37b2d499aeef06f5a839501a0ebff957: Status 404 returned error can't find the container with id 464b7b3aea19779c7911084c3b4156dc37b2d499aeef06f5a839501a0ebff957 Apr 24 21:22:09.363608 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:09.363573 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-w7ggm" event={"ID":"486f6b20-c896-4f8a-8188-84a01590a54a","Type":"ContainerStarted","Data":"464b7b3aea19779c7911084c3b4156dc37b2d499aeef06f5a839501a0ebff957"} Apr 24 21:22:11.372012 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:11.371977 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-w7ggm" event={"ID":"486f6b20-c896-4f8a-8188-84a01590a54a","Type":"ContainerStarted","Data":"06d38750ad4a58366c7bb1e87d741a2f9d3042fe53a9354cd79464784ac097df"} Apr 24 21:22:11.389358 ip-10-0-128-142 
kubenswrapper[2560]: I0424 21:22:11.389312 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-w7ggm" podStartSLOduration=1.809481933 podStartE2EDuration="3.389297565s" podCreationTimestamp="2026-04-24 21:22:08 +0000 UTC" firstStartedPulling="2026-04-24 21:22:08.874720882 +0000 UTC m=+316.933508535" lastFinishedPulling="2026-04-24 21:22:10.454536516 +0000 UTC m=+318.513324167" observedRunningTime="2026-04-24 21:22:11.387169887 +0000 UTC m=+319.445957559" watchObservedRunningTime="2026-04-24 21:22:11.389297565 +0000 UTC m=+319.448085236" Apr 24 21:22:22.001283 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:22.001241 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835pssrk"] Apr 24 21:22:22.003600 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:22.003575 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835pssrk" Apr 24 21:22:22.005905 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:22.005874 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 24 21:22:22.005905 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:22.005879 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-d9lp2\"" Apr 24 21:22:22.006731 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:22.006713 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 24 21:22:22.012788 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:22.012767 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835pssrk"] Apr 24 21:22:22.095504 
ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:22.095471 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/43005bc2-f9e6-4f5d-a9b7-9605b678cb42-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835pssrk\" (UID: \"43005bc2-f9e6-4f5d-a9b7-9605b678cb42\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835pssrk" Apr 24 21:22:22.095698 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:22.095522 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9k65\" (UniqueName: \"kubernetes.io/projected/43005bc2-f9e6-4f5d-a9b7-9605b678cb42-kube-api-access-j9k65\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835pssrk\" (UID: \"43005bc2-f9e6-4f5d-a9b7-9605b678cb42\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835pssrk" Apr 24 21:22:22.095698 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:22.095640 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/43005bc2-f9e6-4f5d-a9b7-9605b678cb42-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835pssrk\" (UID: \"43005bc2-f9e6-4f5d-a9b7-9605b678cb42\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835pssrk" Apr 24 21:22:22.196612 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:22.196574 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/43005bc2-f9e6-4f5d-a9b7-9605b678cb42-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835pssrk\" (UID: \"43005bc2-f9e6-4f5d-a9b7-9605b678cb42\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835pssrk" Apr 24 21:22:22.196788 ip-10-0-128-142 
kubenswrapper[2560]: I0424 21:22:22.196624 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j9k65\" (UniqueName: \"kubernetes.io/projected/43005bc2-f9e6-4f5d-a9b7-9605b678cb42-kube-api-access-j9k65\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835pssrk\" (UID: \"43005bc2-f9e6-4f5d-a9b7-9605b678cb42\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835pssrk" Apr 24 21:22:22.196788 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:22.196659 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/43005bc2-f9e6-4f5d-a9b7-9605b678cb42-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835pssrk\" (UID: \"43005bc2-f9e6-4f5d-a9b7-9605b678cb42\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835pssrk" Apr 24 21:22:22.197031 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:22.197010 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/43005bc2-f9e6-4f5d-a9b7-9605b678cb42-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835pssrk\" (UID: \"43005bc2-f9e6-4f5d-a9b7-9605b678cb42\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835pssrk" Apr 24 21:22:22.197071 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:22.197031 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/43005bc2-f9e6-4f5d-a9b7-9605b678cb42-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835pssrk\" (UID: \"43005bc2-f9e6-4f5d-a9b7-9605b678cb42\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835pssrk" Apr 24 21:22:22.206239 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:22.206218 2560 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-j9k65\" (UniqueName: \"kubernetes.io/projected/43005bc2-f9e6-4f5d-a9b7-9605b678cb42-kube-api-access-j9k65\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835pssrk\" (UID: \"43005bc2-f9e6-4f5d-a9b7-9605b678cb42\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835pssrk" Apr 24 21:22:22.313408 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:22.313369 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835pssrk" Apr 24 21:22:22.430299 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:22.430271 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835pssrk"] Apr 24 21:22:22.433188 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:22:22.433152 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43005bc2_f9e6_4f5d_a9b7_9605b678cb42.slice/crio-21b6fbc275e5cedb373ce7fdcc7e5139393163671ef55a9c7f6416a781ec651f WatchSource:0}: Error finding container 21b6fbc275e5cedb373ce7fdcc7e5139393163671ef55a9c7f6416a781ec651f: Status 404 returned error can't find the container with id 21b6fbc275e5cedb373ce7fdcc7e5139393163671ef55a9c7f6416a781ec651f Apr 24 21:22:23.416395 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:23.416359 2560 generic.go:358] "Generic (PLEG): container finished" podID="43005bc2-f9e6-4f5d-a9b7-9605b678cb42" containerID="347a229877ddb8a83f1597d0ca50b1d5bb2b9092ddc7e7c3b2e7bf61f6d5564a" exitCode=0 Apr 24 21:22:23.416771 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:23.416443 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835pssrk" 
event={"ID":"43005bc2-f9e6-4f5d-a9b7-9605b678cb42","Type":"ContainerDied","Data":"347a229877ddb8a83f1597d0ca50b1d5bb2b9092ddc7e7c3b2e7bf61f6d5564a"} Apr 24 21:22:23.416771 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:23.416485 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835pssrk" event={"ID":"43005bc2-f9e6-4f5d-a9b7-9605b678cb42","Type":"ContainerStarted","Data":"21b6fbc275e5cedb373ce7fdcc7e5139393163671ef55a9c7f6416a781ec651f"} Apr 24 21:22:24.421690 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:24.421656 2560 generic.go:358] "Generic (PLEG): container finished" podID="43005bc2-f9e6-4f5d-a9b7-9605b678cb42" containerID="de36a7157e4d427b38e353157fab186305a81b8bc006b4f008b45fcc2e2e46e9" exitCode=0 Apr 24 21:22:24.422059 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:24.421705 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835pssrk" event={"ID":"43005bc2-f9e6-4f5d-a9b7-9605b678cb42","Type":"ContainerDied","Data":"de36a7157e4d427b38e353157fab186305a81b8bc006b4f008b45fcc2e2e46e9"} Apr 24 21:22:25.426971 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:25.426936 2560 generic.go:358] "Generic (PLEG): container finished" podID="43005bc2-f9e6-4f5d-a9b7-9605b678cb42" containerID="40e38f970b8fe6be551de9deb3447f94d6041501af96bf04461d2a174f636417" exitCode=0 Apr 24 21:22:25.427314 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:25.427023 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835pssrk" event={"ID":"43005bc2-f9e6-4f5d-a9b7-9605b678cb42","Type":"ContainerDied","Data":"40e38f970b8fe6be551de9deb3447f94d6041501af96bf04461d2a174f636417"} Apr 24 21:22:26.554524 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:26.554495 2560 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835pssrk" Apr 24 21:22:26.628609 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:26.628577 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/43005bc2-f9e6-4f5d-a9b7-9605b678cb42-bundle\") pod \"43005bc2-f9e6-4f5d-a9b7-9605b678cb42\" (UID: \"43005bc2-f9e6-4f5d-a9b7-9605b678cb42\") " Apr 24 21:22:26.628771 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:26.628620 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j9k65\" (UniqueName: \"kubernetes.io/projected/43005bc2-f9e6-4f5d-a9b7-9605b678cb42-kube-api-access-j9k65\") pod \"43005bc2-f9e6-4f5d-a9b7-9605b678cb42\" (UID: \"43005bc2-f9e6-4f5d-a9b7-9605b678cb42\") " Apr 24 21:22:26.628771 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:26.628666 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/43005bc2-f9e6-4f5d-a9b7-9605b678cb42-util\") pod \"43005bc2-f9e6-4f5d-a9b7-9605b678cb42\" (UID: \"43005bc2-f9e6-4f5d-a9b7-9605b678cb42\") " Apr 24 21:22:26.629476 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:26.629441 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43005bc2-f9e6-4f5d-a9b7-9605b678cb42-bundle" (OuterVolumeSpecName: "bundle") pod "43005bc2-f9e6-4f5d-a9b7-9605b678cb42" (UID: "43005bc2-f9e6-4f5d-a9b7-9605b678cb42"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:22:26.630718 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:26.630694 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43005bc2-f9e6-4f5d-a9b7-9605b678cb42-kube-api-access-j9k65" (OuterVolumeSpecName: "kube-api-access-j9k65") pod "43005bc2-f9e6-4f5d-a9b7-9605b678cb42" (UID: "43005bc2-f9e6-4f5d-a9b7-9605b678cb42"). InnerVolumeSpecName "kube-api-access-j9k65". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:22:26.634006 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:26.633972 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43005bc2-f9e6-4f5d-a9b7-9605b678cb42-util" (OuterVolumeSpecName: "util") pod "43005bc2-f9e6-4f5d-a9b7-9605b678cb42" (UID: "43005bc2-f9e6-4f5d-a9b7-9605b678cb42"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:22:26.729280 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:26.729194 2560 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/43005bc2-f9e6-4f5d-a9b7-9605b678cb42-bundle\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\"" Apr 24 21:22:26.729280 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:26.729232 2560 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-j9k65\" (UniqueName: \"kubernetes.io/projected/43005bc2-f9e6-4f5d-a9b7-9605b678cb42-kube-api-access-j9k65\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\"" Apr 24 21:22:26.729280 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:26.729244 2560 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/43005bc2-f9e6-4f5d-a9b7-9605b678cb42-util\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\"" Apr 24 21:22:27.434934 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:27.434881 2560 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835pssrk" event={"ID":"43005bc2-f9e6-4f5d-a9b7-9605b678cb42","Type":"ContainerDied","Data":"21b6fbc275e5cedb373ce7fdcc7e5139393163671ef55a9c7f6416a781ec651f"}
Apr 24 21:22:27.435098 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:27.434898 2560 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835pssrk"
Apr 24 21:22:27.435098 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:27.434949 2560 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="21b6fbc275e5cedb373ce7fdcc7e5139393163671ef55a9c7f6416a781ec651f"
Apr 24 21:22:31.617084 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:31.617054 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-7f997f587c-sbdmj"]
Apr 24 21:22:31.617570 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:31.617313 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="43005bc2-f9e6-4f5d-a9b7-9605b678cb42" containerName="extract"
Apr 24 21:22:31.617570 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:31.617324 2560 state_mem.go:107] "Deleted CPUSet assignment" podUID="43005bc2-f9e6-4f5d-a9b7-9605b678cb42" containerName="extract"
Apr 24 21:22:31.617570 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:31.617345 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="43005bc2-f9e6-4f5d-a9b7-9605b678cb42" containerName="util"
Apr 24 21:22:31.617570 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:31.617351 2560 state_mem.go:107] "Deleted CPUSet assignment" podUID="43005bc2-f9e6-4f5d-a9b7-9605b678cb42" containerName="util"
Apr 24 21:22:31.617570 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:31.617359 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="43005bc2-f9e6-4f5d-a9b7-9605b678cb42" containerName="pull"
Apr 24 21:22:31.617570 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:31.617364 2560 state_mem.go:107] "Deleted CPUSet assignment" podUID="43005bc2-f9e6-4f5d-a9b7-9605b678cb42" containerName="pull"
Apr 24 21:22:31.617570 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:31.617409 2560 memory_manager.go:356] "RemoveStaleState removing state" podUID="43005bc2-f9e6-4f5d-a9b7-9605b678cb42" containerName="extract"
Apr 24 21:22:31.623282 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:31.623265 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-7f997f587c-sbdmj"
Apr 24 21:22:31.626655 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:31.626632 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\""
Apr 24 21:22:31.626785 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:31.626652 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\""
Apr 24 21:22:31.626785 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:31.626639 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\""
Apr 24 21:22:31.626785 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:31.626705 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-kt5gk\""
Apr 24 21:22:31.628883 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:31.628858 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-7f997f587c-sbdmj"]
Apr 24 21:22:31.767162 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:31.767125 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/43f237ab-989c-487f-90ed-d10683e377ab-cert\") pod \"lws-controller-manager-7f997f587c-sbdmj\" (UID: \"43f237ab-989c-487f-90ed-d10683e377ab\") " pod="openshift-lws-operator/lws-controller-manager-7f997f587c-sbdmj"
Apr 24 21:22:31.767331 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:31.767173 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvthl\" (UniqueName: \"kubernetes.io/projected/43f237ab-989c-487f-90ed-d10683e377ab-kube-api-access-gvthl\") pod \"lws-controller-manager-7f997f587c-sbdmj\" (UID: \"43f237ab-989c-487f-90ed-d10683e377ab\") " pod="openshift-lws-operator/lws-controller-manager-7f997f587c-sbdmj"
Apr 24 21:22:31.767331 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:31.767204 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/43f237ab-989c-487f-90ed-d10683e377ab-metrics-cert\") pod \"lws-controller-manager-7f997f587c-sbdmj\" (UID: \"43f237ab-989c-487f-90ed-d10683e377ab\") " pod="openshift-lws-operator/lws-controller-manager-7f997f587c-sbdmj"
Apr 24 21:22:31.767331 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:31.767268 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/43f237ab-989c-487f-90ed-d10683e377ab-manager-config\") pod \"lws-controller-manager-7f997f587c-sbdmj\" (UID: \"43f237ab-989c-487f-90ed-d10683e377ab\") " pod="openshift-lws-operator/lws-controller-manager-7f997f587c-sbdmj"
Apr 24 21:22:31.868658 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:31.868580 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/43f237ab-989c-487f-90ed-d10683e377ab-cert\") pod \"lws-controller-manager-7f997f587c-sbdmj\" (UID: \"43f237ab-989c-487f-90ed-d10683e377ab\") " pod="openshift-lws-operator/lws-controller-manager-7f997f587c-sbdmj"
Apr 24 21:22:31.868658 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:31.868621 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gvthl\" (UniqueName: \"kubernetes.io/projected/43f237ab-989c-487f-90ed-d10683e377ab-kube-api-access-gvthl\") pod \"lws-controller-manager-7f997f587c-sbdmj\" (UID: \"43f237ab-989c-487f-90ed-d10683e377ab\") " pod="openshift-lws-operator/lws-controller-manager-7f997f587c-sbdmj"
Apr 24 21:22:31.868658 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:31.868649 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/43f237ab-989c-487f-90ed-d10683e377ab-metrics-cert\") pod \"lws-controller-manager-7f997f587c-sbdmj\" (UID: \"43f237ab-989c-487f-90ed-d10683e377ab\") " pod="openshift-lws-operator/lws-controller-manager-7f997f587c-sbdmj"
Apr 24 21:22:31.868870 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:31.868677 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/43f237ab-989c-487f-90ed-d10683e377ab-manager-config\") pod \"lws-controller-manager-7f997f587c-sbdmj\" (UID: \"43f237ab-989c-487f-90ed-d10683e377ab\") " pod="openshift-lws-operator/lws-controller-manager-7f997f587c-sbdmj"
Apr 24 21:22:31.869273 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:31.869252 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/43f237ab-989c-487f-90ed-d10683e377ab-manager-config\") pod \"lws-controller-manager-7f997f587c-sbdmj\" (UID: \"43f237ab-989c-487f-90ed-d10683e377ab\") " pod="openshift-lws-operator/lws-controller-manager-7f997f587c-sbdmj"
Apr 24 21:22:31.871139 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:31.871116 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/43f237ab-989c-487f-90ed-d10683e377ab-cert\") pod \"lws-controller-manager-7f997f587c-sbdmj\" (UID: \"43f237ab-989c-487f-90ed-d10683e377ab\") " pod="openshift-lws-operator/lws-controller-manager-7f997f587c-sbdmj"
Apr 24 21:22:31.871228 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:31.871144 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/43f237ab-989c-487f-90ed-d10683e377ab-metrics-cert\") pod \"lws-controller-manager-7f997f587c-sbdmj\" (UID: \"43f237ab-989c-487f-90ed-d10683e377ab\") " pod="openshift-lws-operator/lws-controller-manager-7f997f587c-sbdmj"
Apr 24 21:22:31.877613 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:31.877590 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvthl\" (UniqueName: \"kubernetes.io/projected/43f237ab-989c-487f-90ed-d10683e377ab-kube-api-access-gvthl\") pod \"lws-controller-manager-7f997f587c-sbdmj\" (UID: \"43f237ab-989c-487f-90ed-d10683e377ab\") " pod="openshift-lws-operator/lws-controller-manager-7f997f587c-sbdmj"
Apr 24 21:22:31.933473 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:31.933437 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-7f997f587c-sbdmj"
Apr 24 21:22:32.079703 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:32.079651 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-7f997f587c-sbdmj"]
Apr 24 21:22:32.081774 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:22:32.081739 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43f237ab_989c_487f_90ed_d10683e377ab.slice/crio-f45bc4ca0d613a19e104fd88f2193e8de0729b9596a1bdde78a4e2c8eba38bc1 WatchSource:0}: Error finding container f45bc4ca0d613a19e104fd88f2193e8de0729b9596a1bdde78a4e2c8eba38bc1: Status 404 returned error can't find the container with id f45bc4ca0d613a19e104fd88f2193e8de0729b9596a1bdde78a4e2c8eba38bc1
Apr 24 21:22:32.451901 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:32.451867 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-7f997f587c-sbdmj" event={"ID":"43f237ab-989c-487f-90ed-d10683e377ab","Type":"ContainerStarted","Data":"f45bc4ca0d613a19e104fd88f2193e8de0729b9596a1bdde78a4e2c8eba38bc1"}
Apr 24 21:22:34.459686 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:34.459652 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-7f997f587c-sbdmj" event={"ID":"43f237ab-989c-487f-90ed-d10683e377ab","Type":"ContainerStarted","Data":"0fe7244499bb28b95750852062b7a436efe8a34f8098e178b8c793eb6edf35fb"}
Apr 24 21:22:34.460073 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:34.459728 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-7f997f587c-sbdmj"
Apr 24 21:22:34.478089 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:34.478043 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-7f997f587c-sbdmj" podStartSLOduration=1.196832063 podStartE2EDuration="3.478026257s" podCreationTimestamp="2026-04-24 21:22:31 +0000 UTC" firstStartedPulling="2026-04-24 21:22:32.083454776 +0000 UTC m=+340.142242426" lastFinishedPulling="2026-04-24 21:22:34.364648968 +0000 UTC m=+342.423436620" observedRunningTime="2026-04-24 21:22:34.477031362 +0000 UTC m=+342.535819056" watchObservedRunningTime="2026-04-24 21:22:34.478026257 +0000 UTC m=+342.536813929"
Apr 24 21:22:36.313991 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:36.313954 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebmlm6p"]
Apr 24 21:22:36.317325 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:36.317308 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebmlm6p"
Apr 24 21:22:36.320744 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:36.320718 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 24 21:22:36.320876 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:36.320848 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 24 21:22:36.321618 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:36.321603 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-d9lp2\""
Apr 24 21:22:36.342105 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:36.342080 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebmlm6p"]
Apr 24 21:22:36.406032 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:36.406001 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgxkz\" (UniqueName: \"kubernetes.io/projected/de231e75-94d2-4ea9-a693-f1d2c2c281a0-kube-api-access-dgxkz\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebmlm6p\" (UID: \"de231e75-94d2-4ea9-a693-f1d2c2c281a0\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebmlm6p"
Apr 24 21:22:36.406197 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:36.406060 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/de231e75-94d2-4ea9-a693-f1d2c2c281a0-bundle\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebmlm6p\" (UID: \"de231e75-94d2-4ea9-a693-f1d2c2c281a0\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebmlm6p"
Apr 24 21:22:36.406197 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:36.406100 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/de231e75-94d2-4ea9-a693-f1d2c2c281a0-util\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebmlm6p\" (UID: \"de231e75-94d2-4ea9-a693-f1d2c2c281a0\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebmlm6p"
Apr 24 21:22:36.507117 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:36.507075 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/de231e75-94d2-4ea9-a693-f1d2c2c281a0-util\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebmlm6p\" (UID: \"de231e75-94d2-4ea9-a693-f1d2c2c281a0\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebmlm6p"
Apr 24 21:22:36.507306 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:36.507130 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dgxkz\" (UniqueName: \"kubernetes.io/projected/de231e75-94d2-4ea9-a693-f1d2c2c281a0-kube-api-access-dgxkz\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebmlm6p\" (UID: \"de231e75-94d2-4ea9-a693-f1d2c2c281a0\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebmlm6p"
Apr 24 21:22:36.507306 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:36.507198 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/de231e75-94d2-4ea9-a693-f1d2c2c281a0-bundle\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebmlm6p\" (UID: \"de231e75-94d2-4ea9-a693-f1d2c2c281a0\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebmlm6p"
Apr 24 21:22:36.507541 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:36.507514 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/de231e75-94d2-4ea9-a693-f1d2c2c281a0-util\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebmlm6p\" (UID: \"de231e75-94d2-4ea9-a693-f1d2c2c281a0\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebmlm6p"
Apr 24 21:22:36.507600 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:36.507529 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/de231e75-94d2-4ea9-a693-f1d2c2c281a0-bundle\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebmlm6p\" (UID: \"de231e75-94d2-4ea9-a693-f1d2c2c281a0\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebmlm6p"
Apr 24 21:22:36.532993 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:36.532960 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgxkz\" (UniqueName: \"kubernetes.io/projected/de231e75-94d2-4ea9-a693-f1d2c2c281a0-kube-api-access-dgxkz\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebmlm6p\" (UID: \"de231e75-94d2-4ea9-a693-f1d2c2c281a0\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebmlm6p"
Apr 24 21:22:36.626092 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:36.626010 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebmlm6p"
Apr 24 21:22:36.809723 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:36.809700 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebmlm6p"]
Apr 24 21:22:36.811619 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:22:36.811588 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde231e75_94d2_4ea9_a693_f1d2c2c281a0.slice/crio-7a75f86b19a9f447d647f0294a66dff427a6d120f85be8d48df8bdcb070735dd WatchSource:0}: Error finding container 7a75f86b19a9f447d647f0294a66dff427a6d120f85be8d48df8bdcb070735dd: Status 404 returned error can't find the container with id 7a75f86b19a9f447d647f0294a66dff427a6d120f85be8d48df8bdcb070735dd
Apr 24 21:22:37.470124 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:37.470088 2560 generic.go:358] "Generic (PLEG): container finished" podID="de231e75-94d2-4ea9-a693-f1d2c2c281a0" containerID="ca5594cab5e3a2900eda199df34256e3555388ecb8f83caee5f285b8b6f9ff1b" exitCode=0
Apr 24 21:22:37.470594 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:37.470135 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebmlm6p" event={"ID":"de231e75-94d2-4ea9-a693-f1d2c2c281a0","Type":"ContainerDied","Data":"ca5594cab5e3a2900eda199df34256e3555388ecb8f83caee5f285b8b6f9ff1b"}
Apr 24 21:22:37.470594 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:37.470154 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebmlm6p" event={"ID":"de231e75-94d2-4ea9-a693-f1d2c2c281a0","Type":"ContainerStarted","Data":"7a75f86b19a9f447d647f0294a66dff427a6d120f85be8d48df8bdcb070735dd"}
Apr 24 21:22:38.474201 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:38.474122 2560 generic.go:358] "Generic (PLEG): container finished" podID="de231e75-94d2-4ea9-a693-f1d2c2c281a0" containerID="a8fef97804120b4b0091b0fa468994dbd2501a8099f7c46ea0f4d4ac8d6fa8c6" exitCode=0
Apr 24 21:22:38.474640 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:38.474213 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebmlm6p" event={"ID":"de231e75-94d2-4ea9-a693-f1d2c2c281a0","Type":"ContainerDied","Data":"a8fef97804120b4b0091b0fa468994dbd2501a8099f7c46ea0f4d4ac8d6fa8c6"}
Apr 24 21:22:39.478768 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:39.478735 2560 generic.go:358] "Generic (PLEG): container finished" podID="de231e75-94d2-4ea9-a693-f1d2c2c281a0" containerID="6463a3a74bc7324759824bb469de2f8f14a2cd6c46ad67105dc7a9cee30ca98e" exitCode=0
Apr 24 21:22:39.479164 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:39.478823 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebmlm6p" event={"ID":"de231e75-94d2-4ea9-a693-f1d2c2c281a0","Type":"ContainerDied","Data":"6463a3a74bc7324759824bb469de2f8f14a2cd6c46ad67105dc7a9cee30ca98e"}
Apr 24 21:22:40.595983 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:40.595956 2560 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebmlm6p"
Apr 24 21:22:40.740847 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:40.740764 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/de231e75-94d2-4ea9-a693-f1d2c2c281a0-bundle\") pod \"de231e75-94d2-4ea9-a693-f1d2c2c281a0\" (UID: \"de231e75-94d2-4ea9-a693-f1d2c2c281a0\") "
Apr 24 21:22:40.740847 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:40.740806 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/de231e75-94d2-4ea9-a693-f1d2c2c281a0-util\") pod \"de231e75-94d2-4ea9-a693-f1d2c2c281a0\" (UID: \"de231e75-94d2-4ea9-a693-f1d2c2c281a0\") "
Apr 24 21:22:40.740847 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:40.740840 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dgxkz\" (UniqueName: \"kubernetes.io/projected/de231e75-94d2-4ea9-a693-f1d2c2c281a0-kube-api-access-dgxkz\") pod \"de231e75-94d2-4ea9-a693-f1d2c2c281a0\" (UID: \"de231e75-94d2-4ea9-a693-f1d2c2c281a0\") "
Apr 24 21:22:40.741950 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:40.741902 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de231e75-94d2-4ea9-a693-f1d2c2c281a0-bundle" (OuterVolumeSpecName: "bundle") pod "de231e75-94d2-4ea9-a693-f1d2c2c281a0" (UID: "de231e75-94d2-4ea9-a693-f1d2c2c281a0"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:22:40.742877 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:40.742843 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de231e75-94d2-4ea9-a693-f1d2c2c281a0-kube-api-access-dgxkz" (OuterVolumeSpecName: "kube-api-access-dgxkz") pod "de231e75-94d2-4ea9-a693-f1d2c2c281a0" (UID: "de231e75-94d2-4ea9-a693-f1d2c2c281a0"). InnerVolumeSpecName "kube-api-access-dgxkz". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:22:40.746398 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:40.746377 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de231e75-94d2-4ea9-a693-f1d2c2c281a0-util" (OuterVolumeSpecName: "util") pod "de231e75-94d2-4ea9-a693-f1d2c2c281a0" (UID: "de231e75-94d2-4ea9-a693-f1d2c2c281a0"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:22:40.841539 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:40.841512 2560 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/de231e75-94d2-4ea9-a693-f1d2c2c281a0-bundle\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\""
Apr 24 21:22:40.841539 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:40.841535 2560 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/de231e75-94d2-4ea9-a693-f1d2c2c281a0-util\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\""
Apr 24 21:22:40.841682 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:40.841544 2560 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dgxkz\" (UniqueName: \"kubernetes.io/projected/de231e75-94d2-4ea9-a693-f1d2c2c281a0-kube-api-access-dgxkz\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\""
Apr 24 21:22:41.486781 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:41.486707 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebmlm6p" event={"ID":"de231e75-94d2-4ea9-a693-f1d2c2c281a0","Type":"ContainerDied","Data":"7a75f86b19a9f447d647f0294a66dff427a6d120f85be8d48df8bdcb070735dd"}
Apr 24 21:22:41.486781 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:41.486739 2560 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7a75f86b19a9f447d647f0294a66dff427a6d120f85be8d48df8bdcb070735dd"
Apr 24 21:22:41.486781 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:41.486750 2560 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebmlm6p"
Apr 24 21:22:45.465831 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:45.465801 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-7f997f587c-sbdmj"
Apr 24 21:22:57.781835 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:57.781803 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767blnkb8"]
Apr 24 21:22:57.782247 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:57.782081 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="de231e75-94d2-4ea9-a693-f1d2c2c281a0" containerName="pull"
Apr 24 21:22:57.782247 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:57.782096 2560 state_mem.go:107] "Deleted CPUSet assignment" podUID="de231e75-94d2-4ea9-a693-f1d2c2c281a0" containerName="pull"
Apr 24 21:22:57.782247 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:57.782107 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="de231e75-94d2-4ea9-a693-f1d2c2c281a0" containerName="extract"
Apr 24 21:22:57.782247 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:57.782112 2560 state_mem.go:107] "Deleted CPUSet assignment" podUID="de231e75-94d2-4ea9-a693-f1d2c2c281a0" containerName="extract"
Apr 24 21:22:57.782247 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:57.782119 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="de231e75-94d2-4ea9-a693-f1d2c2c281a0" containerName="util"
Apr 24 21:22:57.782247 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:57.782125 2560 state_mem.go:107] "Deleted CPUSet assignment" podUID="de231e75-94d2-4ea9-a693-f1d2c2c281a0" containerName="util"
Apr 24 21:22:57.782247 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:57.782166 2560 memory_manager.go:356] "RemoveStaleState removing state" podUID="de231e75-94d2-4ea9-a693-f1d2c2c281a0" containerName="extract"
Apr 24 21:22:57.785475 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:57.785459 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767blnkb8"
Apr 24 21:22:57.787814 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:57.787794 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 24 21:22:57.787814 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:57.787805 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 24 21:22:57.788534 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:57.788517 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-d9lp2\""
Apr 24 21:22:57.793508 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:57.793488 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767blnkb8"]
Apr 24 21:22:57.886467 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:57.886437 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e50367lns"]
Apr 24 21:22:57.889532 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:57.889517 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e50367lns"
Apr 24 21:22:57.897103 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:57.897074 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e50367lns"]
Apr 24 21:22:57.972112 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:57.972072 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b6f596de-438b-47d3-ab76-c5c146ee8432-util\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767blnkb8\" (UID: \"b6f596de-438b-47d3-ab76-c5c146ee8432\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767blnkb8"
Apr 24 21:22:57.972256 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:57.972185 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b6f596de-438b-47d3-ab76-c5c146ee8432-bundle\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767blnkb8\" (UID: \"b6f596de-438b-47d3-ab76-c5c146ee8432\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767blnkb8"
Apr 24 21:22:57.972256 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:57.972227 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cv29b\" (UniqueName: \"kubernetes.io/projected/b6f596de-438b-47d3-ab76-c5c146ee8432-kube-api-access-cv29b\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767blnkb8\" (UID: \"b6f596de-438b-47d3-ab76-c5c146ee8432\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767blnkb8"
Apr 24 21:22:57.980526 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:57.980500 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30kpd9m"]
Apr 24 21:22:57.984258 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:57.984240 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30kpd9m"
Apr 24 21:22:57.991872 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:57.991850 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30kpd9m"]
Apr 24 21:22:58.073068 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:58.072996 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4c578b81-499f-4987-ac53-a52d5fc7156a-util\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e50367lns\" (UID: \"4c578b81-499f-4987-ac53-a52d5fc7156a\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e50367lns"
Apr 24 21:22:58.073068 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:58.073039 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgpl8\" (UniqueName: \"kubernetes.io/projected/4c578b81-499f-4987-ac53-a52d5fc7156a-kube-api-access-lgpl8\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e50367lns\" (UID: \"4c578b81-499f-4987-ac53-a52d5fc7156a\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e50367lns"
Apr 24 21:22:58.073068 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:58.073062 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b6f596de-438b-47d3-ab76-c5c146ee8432-util\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767blnkb8\" (UID: \"b6f596de-438b-47d3-ab76-c5c146ee8432\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767blnkb8"
Apr 24 21:22:58.073289 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:58.073113 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4c578b81-499f-4987-ac53-a52d5fc7156a-bundle\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e50367lns\" (UID: \"4c578b81-499f-4987-ac53-a52d5fc7156a\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e50367lns"
Apr 24 21:22:58.073289 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:58.073158 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b6f596de-438b-47d3-ab76-c5c146ee8432-bundle\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767blnkb8\" (UID: \"b6f596de-438b-47d3-ab76-c5c146ee8432\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767blnkb8"
Apr 24 21:22:58.073289 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:58.073189 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cv29b\" (UniqueName: \"kubernetes.io/projected/b6f596de-438b-47d3-ab76-c5c146ee8432-kube-api-access-cv29b\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767blnkb8\" (UID: \"b6f596de-438b-47d3-ab76-c5c146ee8432\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767blnkb8"
Apr 24 21:22:58.073431 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:58.073354 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b6f596de-438b-47d3-ab76-c5c146ee8432-util\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767blnkb8\" (UID: \"b6f596de-438b-47d3-ab76-c5c146ee8432\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767blnkb8"
Apr 24 21:22:58.073517 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:58.073496 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b6f596de-438b-47d3-ab76-c5c146ee8432-bundle\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767blnkb8\" (UID: \"b6f596de-438b-47d3-ab76-c5c146ee8432\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767blnkb8"
Apr 24 21:22:58.085062 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:58.085044 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cv29b\" (UniqueName: \"kubernetes.io/projected/b6f596de-438b-47d3-ab76-c5c146ee8432-kube-api-access-cv29b\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767blnkb8\" (UID: \"b6f596de-438b-47d3-ab76-c5c146ee8432\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767blnkb8"
Apr 24 21:22:58.089786 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:58.089765 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88rz6d7"]
Apr 24 21:22:58.093654 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:58.093639 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88rz6d7"
Apr 24 21:22:58.094501 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:58.094485 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767blnkb8"
Apr 24 21:22:58.102209 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:58.102192 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88rz6d7"]
Apr 24 21:22:58.174523 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:58.174489 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4c578b81-499f-4987-ac53-a52d5fc7156a-util\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e50367lns\" (UID: \"4c578b81-499f-4987-ac53-a52d5fc7156a\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e50367lns"
Apr 24 21:22:58.174523 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:58.174617 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1ddaadbf-e834-45b4-8d39-300928d63db2-bundle\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30kpd9m\" (UID: \"1ddaadbf-e834-45b4-8d39-300928d63db2\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30kpd9m"
Apr 24 21:22:58.174523 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:58.174661 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lgpl8\" (UniqueName: \"kubernetes.io/projected/4c578b81-499f-4987-ac53-a52d5fc7156a-kube-api-access-lgpl8\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e50367lns\" (UID: \"4c578b81-499f-4987-ac53-a52d5fc7156a\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e50367lns"
Apr 24 21:22:58.174523 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:58.174700 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t79lj\" (UniqueName: \"kubernetes.io/projected/1ddaadbf-e834-45b4-8d39-300928d63db2-kube-api-access-t79lj\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30kpd9m\" (UID: \"1ddaadbf-e834-45b4-8d39-300928d63db2\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30kpd9m"
Apr 24 21:22:58.174523 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:58.174742 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4c578b81-499f-4987-ac53-a52d5fc7156a-bundle\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e50367lns\" (UID: \"4c578b81-499f-4987-ac53-a52d5fc7156a\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e50367lns"
Apr 24 21:22:58.174523 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:58.174810 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1ddaadbf-e834-45b4-8d39-300928d63db2-util\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30kpd9m\" (UID: \"1ddaadbf-e834-45b4-8d39-300928d63db2\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30kpd9m"
Apr 24 21:22:58.174523 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:58.174903 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4c578b81-499f-4987-ac53-a52d5fc7156a-util\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e50367lns\" (UID: \"4c578b81-499f-4987-ac53-a52d5fc7156a\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e50367lns"
Apr 24 21:22:58.175338 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:58.175212 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName:
\"kubernetes.io/empty-dir/4c578b81-499f-4987-ac53-a52d5fc7156a-bundle\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e50367lns\" (UID: \"4c578b81-499f-4987-ac53-a52d5fc7156a\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e50367lns" Apr 24 21:22:58.183545 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:58.183524 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgpl8\" (UniqueName: \"kubernetes.io/projected/4c578b81-499f-4987-ac53-a52d5fc7156a-kube-api-access-lgpl8\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e50367lns\" (UID: \"4c578b81-499f-4987-ac53-a52d5fc7156a\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e50367lns" Apr 24 21:22:58.199006 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:58.198982 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e50367lns" Apr 24 21:22:58.213733 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:58.213706 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767blnkb8"] Apr 24 21:22:58.214714 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:22:58.214687 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb6f596de_438b_47d3_ab76_c5c146ee8432.slice/crio-2e09af23f15135b0eb67957b98d46010e57ad5681de84d996dc39885dedd6405 WatchSource:0}: Error finding container 2e09af23f15135b0eb67957b98d46010e57ad5681de84d996dc39885dedd6405: Status 404 returned error can't find the container with id 2e09af23f15135b0eb67957b98d46010e57ad5681de84d996dc39885dedd6405 Apr 24 21:22:58.275811 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:58.275785 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/86774930-e326-4ebc-ba06-e7e96af4015a-util\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88rz6d7\" (UID: \"86774930-e326-4ebc-ba06-e7e96af4015a\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88rz6d7" Apr 24 21:22:58.275903 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:58.275834 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1ddaadbf-e834-45b4-8d39-300928d63db2-bundle\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30kpd9m\" (UID: \"1ddaadbf-e834-45b4-8d39-300928d63db2\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30kpd9m" Apr 24 21:22:58.275903 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:58.275876 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t79lj\" (UniqueName: \"kubernetes.io/projected/1ddaadbf-e834-45b4-8d39-300928d63db2-kube-api-access-t79lj\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30kpd9m\" (UID: \"1ddaadbf-e834-45b4-8d39-300928d63db2\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30kpd9m" Apr 24 21:22:58.276088 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:58.275983 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1ddaadbf-e834-45b4-8d39-300928d63db2-util\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30kpd9m\" (UID: \"1ddaadbf-e834-45b4-8d39-300928d63db2\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30kpd9m" Apr 24 21:22:58.276088 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:58.276031 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/86774930-e326-4ebc-ba06-e7e96af4015a-bundle\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88rz6d7\" (UID: \"86774930-e326-4ebc-ba06-e7e96af4015a\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88rz6d7" Apr 24 21:22:58.276088 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:58.276068 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sktqf\" (UniqueName: \"kubernetes.io/projected/86774930-e326-4ebc-ba06-e7e96af4015a-kube-api-access-sktqf\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88rz6d7\" (UID: \"86774930-e326-4ebc-ba06-e7e96af4015a\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88rz6d7" Apr 24 21:22:58.276603 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:58.276224 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1ddaadbf-e834-45b4-8d39-300928d63db2-bundle\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30kpd9m\" (UID: \"1ddaadbf-e834-45b4-8d39-300928d63db2\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30kpd9m" Apr 24 21:22:58.276603 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:58.276391 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1ddaadbf-e834-45b4-8d39-300928d63db2-util\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30kpd9m\" (UID: \"1ddaadbf-e834-45b4-8d39-300928d63db2\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30kpd9m" Apr 24 21:22:58.284623 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:58.284594 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t79lj\" (UniqueName: 
\"kubernetes.io/projected/1ddaadbf-e834-45b4-8d39-300928d63db2-kube-api-access-t79lj\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30kpd9m\" (UID: \"1ddaadbf-e834-45b4-8d39-300928d63db2\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30kpd9m" Apr 24 21:22:58.293627 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:58.293360 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30kpd9m" Apr 24 21:22:58.326605 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:58.326547 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e50367lns"] Apr 24 21:22:58.362794 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:22:58.362764 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c578b81_499f_4987_ac53_a52d5fc7156a.slice/crio-31e6845f675c2adc445227e5ee6ede7c521c6c40f729343fbe52150e923c28e6 WatchSource:0}: Error finding container 31e6845f675c2adc445227e5ee6ede7c521c6c40f729343fbe52150e923c28e6: Status 404 returned error can't find the container with id 31e6845f675c2adc445227e5ee6ede7c521c6c40f729343fbe52150e923c28e6 Apr 24 21:22:58.376671 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:58.376645 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/86774930-e326-4ebc-ba06-e7e96af4015a-bundle\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88rz6d7\" (UID: \"86774930-e326-4ebc-ba06-e7e96af4015a\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88rz6d7" Apr 24 21:22:58.376875 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:58.376683 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-sktqf\" (UniqueName: \"kubernetes.io/projected/86774930-e326-4ebc-ba06-e7e96af4015a-kube-api-access-sktqf\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88rz6d7\" (UID: \"86774930-e326-4ebc-ba06-e7e96af4015a\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88rz6d7" Apr 24 21:22:58.376875 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:58.376715 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/86774930-e326-4ebc-ba06-e7e96af4015a-util\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88rz6d7\" (UID: \"86774930-e326-4ebc-ba06-e7e96af4015a\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88rz6d7" Apr 24 21:22:58.377150 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:58.377058 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/86774930-e326-4ebc-ba06-e7e96af4015a-bundle\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88rz6d7\" (UID: \"86774930-e326-4ebc-ba06-e7e96af4015a\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88rz6d7" Apr 24 21:22:58.377150 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:58.377072 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/86774930-e326-4ebc-ba06-e7e96af4015a-util\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88rz6d7\" (UID: \"86774930-e326-4ebc-ba06-e7e96af4015a\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88rz6d7" Apr 24 21:22:58.387425 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:58.387397 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sktqf\" (UniqueName: 
\"kubernetes.io/projected/86774930-e326-4ebc-ba06-e7e96af4015a-kube-api-access-sktqf\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88rz6d7\" (UID: \"86774930-e326-4ebc-ba06-e7e96af4015a\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88rz6d7" Apr 24 21:22:58.414108 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:58.414082 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88rz6d7" Apr 24 21:22:58.421968 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:58.421943 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30kpd9m"] Apr 24 21:22:58.424158 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:22:58.424130 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ddaadbf_e834_45b4_8d39_300928d63db2.slice/crio-2c38f6c66f4c9fad0f86702b942960337b921aea65746bdbe791bb19b668d04b WatchSource:0}: Error finding container 2c38f6c66f4c9fad0f86702b942960337b921aea65746bdbe791bb19b668d04b: Status 404 returned error can't find the container with id 2c38f6c66f4c9fad0f86702b942960337b921aea65746bdbe791bb19b668d04b Apr 24 21:22:58.544500 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:58.544472 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88rz6d7"] Apr 24 21:22:58.547680 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:58.547655 2560 generic.go:358] "Generic (PLEG): container finished" podID="1ddaadbf-e834-45b4-8d39-300928d63db2" containerID="6196e330e301cf197fcfa438d9b2eaa1f17590422b92f47c1ee51c3f9f712482" exitCode=0 Apr 24 21:22:58.547777 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:58.547739 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30kpd9m" event={"ID":"1ddaadbf-e834-45b4-8d39-300928d63db2","Type":"ContainerDied","Data":"6196e330e301cf197fcfa438d9b2eaa1f17590422b92f47c1ee51c3f9f712482"} Apr 24 21:22:58.547817 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:58.547789 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30kpd9m" event={"ID":"1ddaadbf-e834-45b4-8d39-300928d63db2","Type":"ContainerStarted","Data":"2c38f6c66f4c9fad0f86702b942960337b921aea65746bdbe791bb19b668d04b"} Apr 24 21:22:58.548977 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:58.548956 2560 generic.go:358] "Generic (PLEG): container finished" podID="b6f596de-438b-47d3-ab76-c5c146ee8432" containerID="415fd4039c5e3a7e132a3a87bd7525d9f09e314d6ce0060b79b89c6d633a3317" exitCode=0 Apr 24 21:22:58.549110 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:58.549084 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767blnkb8" event={"ID":"b6f596de-438b-47d3-ab76-c5c146ee8432","Type":"ContainerDied","Data":"415fd4039c5e3a7e132a3a87bd7525d9f09e314d6ce0060b79b89c6d633a3317"} Apr 24 21:22:58.549239 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:58.549123 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767blnkb8" event={"ID":"b6f596de-438b-47d3-ab76-c5c146ee8432","Type":"ContainerStarted","Data":"2e09af23f15135b0eb67957b98d46010e57ad5681de84d996dc39885dedd6405"} Apr 24 21:22:58.550335 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:58.550318 2560 generic.go:358] "Generic (PLEG): container finished" podID="4c578b81-499f-4987-ac53-a52d5fc7156a" containerID="e573a2984ceceabdc2f2b9bced2c05de4938c90e81042335c57ab42fe15320eb" exitCode=0 Apr 24 21:22:58.550408 ip-10-0-128-142 kubenswrapper[2560]: I0424 
21:22:58.550355 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e50367lns" event={"ID":"4c578b81-499f-4987-ac53-a52d5fc7156a","Type":"ContainerDied","Data":"e573a2984ceceabdc2f2b9bced2c05de4938c90e81042335c57ab42fe15320eb"} Apr 24 21:22:58.550408 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:58.550371 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e50367lns" event={"ID":"4c578b81-499f-4987-ac53-a52d5fc7156a","Type":"ContainerStarted","Data":"31e6845f675c2adc445227e5ee6ede7c521c6c40f729343fbe52150e923c28e6"} Apr 24 21:22:58.563799 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:22:58.563775 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86774930_e326_4ebc_ba06_e7e96af4015a.slice/crio-60c662c461642df04d70e7986d5ace5868cee06526ecbff18784ffa5e7e12001 WatchSource:0}: Error finding container 60c662c461642df04d70e7986d5ace5868cee06526ecbff18784ffa5e7e12001: Status 404 returned error can't find the container with id 60c662c461642df04d70e7986d5ace5868cee06526ecbff18784ffa5e7e12001 Apr 24 21:22:59.555461 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:59.555430 2560 generic.go:358] "Generic (PLEG): container finished" podID="b6f596de-438b-47d3-ab76-c5c146ee8432" containerID="53a2a9557c54c479f707280277bddf81029b67ee3e2f6f3eaee3ed5119ba01ac" exitCode=0 Apr 24 21:22:59.555900 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:59.555494 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767blnkb8" event={"ID":"b6f596de-438b-47d3-ab76-c5c146ee8432","Type":"ContainerDied","Data":"53a2a9557c54c479f707280277bddf81029b67ee3e2f6f3eaee3ed5119ba01ac"} Apr 24 21:22:59.556707 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:59.556684 2560 
generic.go:358] "Generic (PLEG): container finished" podID="86774930-e326-4ebc-ba06-e7e96af4015a" containerID="de8a3da3b2e7576f3fda7a7a2932059dd013c40265c6c5bcba931f1ccc5ab736" exitCode=0 Apr 24 21:22:59.556788 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:59.556767 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88rz6d7" event={"ID":"86774930-e326-4ebc-ba06-e7e96af4015a","Type":"ContainerDied","Data":"de8a3da3b2e7576f3fda7a7a2932059dd013c40265c6c5bcba931f1ccc5ab736"} Apr 24 21:22:59.556838 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:22:59.556804 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88rz6d7" event={"ID":"86774930-e326-4ebc-ba06-e7e96af4015a","Type":"ContainerStarted","Data":"60c662c461642df04d70e7986d5ace5868cee06526ecbff18784ffa5e7e12001"} Apr 24 21:23:00.562172 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:23:00.562144 2560 generic.go:358] "Generic (PLEG): container finished" podID="1ddaadbf-e834-45b4-8d39-300928d63db2" containerID="9927f549c2d064890b44aebe00fa082dd0e9af0a698259eb6bce272a37eb1d9c" exitCode=0 Apr 24 21:23:00.562556 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:23:00.562218 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30kpd9m" event={"ID":"1ddaadbf-e834-45b4-8d39-300928d63db2","Type":"ContainerDied","Data":"9927f549c2d064890b44aebe00fa082dd0e9af0a698259eb6bce272a37eb1d9c"} Apr 24 21:23:00.564082 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:23:00.564061 2560 generic.go:358] "Generic (PLEG): container finished" podID="b6f596de-438b-47d3-ab76-c5c146ee8432" containerID="999f03394982f975e604a0f8815d7dc7f81fbdd9c63d91e685a29895c8652d49" exitCode=0 Apr 24 21:23:00.564159 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:23:00.564124 2560 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767blnkb8" event={"ID":"b6f596de-438b-47d3-ab76-c5c146ee8432","Type":"ContainerDied","Data":"999f03394982f975e604a0f8815d7dc7f81fbdd9c63d91e685a29895c8652d49"} Apr 24 21:23:00.565548 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:23:00.565518 2560 generic.go:358] "Generic (PLEG): container finished" podID="86774930-e326-4ebc-ba06-e7e96af4015a" containerID="da6fe73e28678b0061259e9ce5416aa79a4f1db3d843a9dc480628b5539191fc" exitCode=0 Apr 24 21:23:00.565639 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:23:00.565546 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88rz6d7" event={"ID":"86774930-e326-4ebc-ba06-e7e96af4015a","Type":"ContainerDied","Data":"da6fe73e28678b0061259e9ce5416aa79a4f1db3d843a9dc480628b5539191fc"} Apr 24 21:23:00.567217 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:23:00.567177 2560 generic.go:358] "Generic (PLEG): container finished" podID="4c578b81-499f-4987-ac53-a52d5fc7156a" containerID="5fe1cdcfa96d88a139b3c9768589fe125855d1218b284766d2e07d7e7d33656f" exitCode=0 Apr 24 21:23:00.567217 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:23:00.567207 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e50367lns" event={"ID":"4c578b81-499f-4987-ac53-a52d5fc7156a","Type":"ContainerDied","Data":"5fe1cdcfa96d88a139b3c9768589fe125855d1218b284766d2e07d7e7d33656f"} Apr 24 21:23:01.573051 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:23:01.573018 2560 generic.go:358] "Generic (PLEG): container finished" podID="1ddaadbf-e834-45b4-8d39-300928d63db2" containerID="15db0486bc2918adce3b9ae7c51a75ecd7a94581fff7100b50654577d3b7fcbc" exitCode=0 Apr 24 21:23:01.573466 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:23:01.573100 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30kpd9m" event={"ID":"1ddaadbf-e834-45b4-8d39-300928d63db2","Type":"ContainerDied","Data":"15db0486bc2918adce3b9ae7c51a75ecd7a94581fff7100b50654577d3b7fcbc"} Apr 24 21:23:01.574959 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:23:01.574913 2560 generic.go:358] "Generic (PLEG): container finished" podID="86774930-e326-4ebc-ba06-e7e96af4015a" containerID="3c8eb36d01ffdd5f118ddf74d7b61b6150faf975d8a91d4a1a7b63a3d1768e9d" exitCode=0 Apr 24 21:23:01.575077 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:23:01.574956 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88rz6d7" event={"ID":"86774930-e326-4ebc-ba06-e7e96af4015a","Type":"ContainerDied","Data":"3c8eb36d01ffdd5f118ddf74d7b61b6150faf975d8a91d4a1a7b63a3d1768e9d"} Apr 24 21:23:01.576713 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:23:01.576695 2560 generic.go:358] "Generic (PLEG): container finished" podID="4c578b81-499f-4987-ac53-a52d5fc7156a" containerID="7eb2a87303b426c4400a79f16472e2d52db4a2eadc129fb239a17663052ecc30" exitCode=0 Apr 24 21:23:01.576814 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:23:01.576762 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e50367lns" event={"ID":"4c578b81-499f-4987-ac53-a52d5fc7156a","Type":"ContainerDied","Data":"7eb2a87303b426c4400a79f16472e2d52db4a2eadc129fb239a17663052ecc30"} Apr 24 21:23:01.692299 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:23:01.692278 2560 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767blnkb8" Apr 24 21:23:01.695414 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:23:01.695398 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cv29b\" (UniqueName: \"kubernetes.io/projected/b6f596de-438b-47d3-ab76-c5c146ee8432-kube-api-access-cv29b\") pod \"b6f596de-438b-47d3-ab76-c5c146ee8432\" (UID: \"b6f596de-438b-47d3-ab76-c5c146ee8432\") " Apr 24 21:23:01.695492 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:23:01.695445 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b6f596de-438b-47d3-ab76-c5c146ee8432-bundle\") pod \"b6f596de-438b-47d3-ab76-c5c146ee8432\" (UID: \"b6f596de-438b-47d3-ab76-c5c146ee8432\") " Apr 24 21:23:01.695577 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:23:01.695557 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b6f596de-438b-47d3-ab76-c5c146ee8432-util\") pod \"b6f596de-438b-47d3-ab76-c5c146ee8432\" (UID: \"b6f596de-438b-47d3-ab76-c5c146ee8432\") " Apr 24 21:23:01.695887 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:23:01.695863 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6f596de-438b-47d3-ab76-c5c146ee8432-bundle" (OuterVolumeSpecName: "bundle") pod "b6f596de-438b-47d3-ab76-c5c146ee8432" (UID: "b6f596de-438b-47d3-ab76-c5c146ee8432"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:23:01.697474 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:23:01.697451 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6f596de-438b-47d3-ab76-c5c146ee8432-kube-api-access-cv29b" (OuterVolumeSpecName: "kube-api-access-cv29b") pod "b6f596de-438b-47d3-ab76-c5c146ee8432" (UID: "b6f596de-438b-47d3-ab76-c5c146ee8432"). InnerVolumeSpecName "kube-api-access-cv29b". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:23:01.700053 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:23:01.700031 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6f596de-438b-47d3-ab76-c5c146ee8432-util" (OuterVolumeSpecName: "util") pod "b6f596de-438b-47d3-ab76-c5c146ee8432" (UID: "b6f596de-438b-47d3-ab76-c5c146ee8432"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:23:01.796232 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:23:01.796203 2560 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cv29b\" (UniqueName: \"kubernetes.io/projected/b6f596de-438b-47d3-ab76-c5c146ee8432-kube-api-access-cv29b\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\"" Apr 24 21:23:01.796232 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:23:01.796230 2560 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b6f596de-438b-47d3-ab76-c5c146ee8432-bundle\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\"" Apr 24 21:23:01.796232 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:23:01.796239 2560 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b6f596de-438b-47d3-ab76-c5c146ee8432-util\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\"" Apr 24 21:23:02.582501 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:23:02.582472 2560 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767blnkb8"
Apr 24 21:23:02.582501 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:23:02.582483 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767blnkb8" event={"ID":"b6f596de-438b-47d3-ab76-c5c146ee8432","Type":"ContainerDied","Data":"2e09af23f15135b0eb67957b98d46010e57ad5681de84d996dc39885dedd6405"}
Apr 24 21:23:02.582992 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:23:02.582519 2560 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2e09af23f15135b0eb67957b98d46010e57ad5681de84d996dc39885dedd6405"
Apr 24 21:23:02.720226 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:23:02.720208 2560 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e50367lns"
Apr 24 21:23:02.724266 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:23:02.724246 2560 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30kpd9m"
Apr 24 21:23:02.756109 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:23:02.756087 2560 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88rz6d7"
Apr 24 21:23:02.802411 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:23:02.802379 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1ddaadbf-e834-45b4-8d39-300928d63db2-bundle\") pod \"1ddaadbf-e834-45b4-8d39-300928d63db2\" (UID: \"1ddaadbf-e834-45b4-8d39-300928d63db2\") "
Apr 24 21:23:02.802575 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:23:02.802415 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lgpl8\" (UniqueName: \"kubernetes.io/projected/4c578b81-499f-4987-ac53-a52d5fc7156a-kube-api-access-lgpl8\") pod \"4c578b81-499f-4987-ac53-a52d5fc7156a\" (UID: \"4c578b81-499f-4987-ac53-a52d5fc7156a\") "
Apr 24 21:23:02.802575 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:23:02.802444 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t79lj\" (UniqueName: \"kubernetes.io/projected/1ddaadbf-e834-45b4-8d39-300928d63db2-kube-api-access-t79lj\") pod \"1ddaadbf-e834-45b4-8d39-300928d63db2\" (UID: \"1ddaadbf-e834-45b4-8d39-300928d63db2\") "
Apr 24 21:23:02.802575 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:23:02.802460 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4c578b81-499f-4987-ac53-a52d5fc7156a-util\") pod \"4c578b81-499f-4987-ac53-a52d5fc7156a\" (UID: \"4c578b81-499f-4987-ac53-a52d5fc7156a\") "
Apr 24 21:23:02.802575 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:23:02.802490 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1ddaadbf-e834-45b4-8d39-300928d63db2-util\") pod \"1ddaadbf-e834-45b4-8d39-300928d63db2\" (UID: \"1ddaadbf-e834-45b4-8d39-300928d63db2\") "
Apr 24 21:23:02.802575 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:23:02.802507 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/86774930-e326-4ebc-ba06-e7e96af4015a-util\") pod \"86774930-e326-4ebc-ba06-e7e96af4015a\" (UID: \"86774930-e326-4ebc-ba06-e7e96af4015a\") "
Apr 24 21:23:02.802575 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:23:02.802530 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/86774930-e326-4ebc-ba06-e7e96af4015a-bundle\") pod \"86774930-e326-4ebc-ba06-e7e96af4015a\" (UID: \"86774930-e326-4ebc-ba06-e7e96af4015a\") "
Apr 24 21:23:02.802575 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:23:02.802560 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4c578b81-499f-4987-ac53-a52d5fc7156a-bundle\") pod \"4c578b81-499f-4987-ac53-a52d5fc7156a\" (UID: \"4c578b81-499f-4987-ac53-a52d5fc7156a\") "
Apr 24 21:23:02.802941 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:23:02.802588 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sktqf\" (UniqueName: \"kubernetes.io/projected/86774930-e326-4ebc-ba06-e7e96af4015a-kube-api-access-sktqf\") pod \"86774930-e326-4ebc-ba06-e7e96af4015a\" (UID: \"86774930-e326-4ebc-ba06-e7e96af4015a\") "
Apr 24 21:23:02.803523 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:23:02.803016 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86774930-e326-4ebc-ba06-e7e96af4015a-bundle" (OuterVolumeSpecName: "bundle") pod "86774930-e326-4ebc-ba06-e7e96af4015a" (UID: "86774930-e326-4ebc-ba06-e7e96af4015a"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:23:02.803523 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:23:02.803039 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ddaadbf-e834-45b4-8d39-300928d63db2-bundle" (OuterVolumeSpecName: "bundle") pod "1ddaadbf-e834-45b4-8d39-300928d63db2" (UID: "1ddaadbf-e834-45b4-8d39-300928d63db2"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:23:02.803659 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:23:02.803566 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c578b81-499f-4987-ac53-a52d5fc7156a-bundle" (OuterVolumeSpecName: "bundle") pod "4c578b81-499f-4987-ac53-a52d5fc7156a" (UID: "4c578b81-499f-4987-ac53-a52d5fc7156a"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:23:02.805170 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:23:02.805144 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86774930-e326-4ebc-ba06-e7e96af4015a-kube-api-access-sktqf" (OuterVolumeSpecName: "kube-api-access-sktqf") pod "86774930-e326-4ebc-ba06-e7e96af4015a" (UID: "86774930-e326-4ebc-ba06-e7e96af4015a"). InnerVolumeSpecName "kube-api-access-sktqf". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:23:02.805258 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:23:02.805223 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c578b81-499f-4987-ac53-a52d5fc7156a-kube-api-access-lgpl8" (OuterVolumeSpecName: "kube-api-access-lgpl8") pod "4c578b81-499f-4987-ac53-a52d5fc7156a" (UID: "4c578b81-499f-4987-ac53-a52d5fc7156a"). InnerVolumeSpecName "kube-api-access-lgpl8". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:23:02.805517 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:23:02.805496 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ddaadbf-e834-45b4-8d39-300928d63db2-kube-api-access-t79lj" (OuterVolumeSpecName: "kube-api-access-t79lj") pod "1ddaadbf-e834-45b4-8d39-300928d63db2" (UID: "1ddaadbf-e834-45b4-8d39-300928d63db2"). InnerVolumeSpecName "kube-api-access-t79lj". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:23:02.808289 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:23:02.808262 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ddaadbf-e834-45b4-8d39-300928d63db2-util" (OuterVolumeSpecName: "util") pod "1ddaadbf-e834-45b4-8d39-300928d63db2" (UID: "1ddaadbf-e834-45b4-8d39-300928d63db2"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:23:02.808767 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:23:02.808747 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c578b81-499f-4987-ac53-a52d5fc7156a-util" (OuterVolumeSpecName: "util") pod "4c578b81-499f-4987-ac53-a52d5fc7156a" (UID: "4c578b81-499f-4987-ac53-a52d5fc7156a"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:23:02.808856 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:23:02.808834 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86774930-e326-4ebc-ba06-e7e96af4015a-util" (OuterVolumeSpecName: "util") pod "86774930-e326-4ebc-ba06-e7e96af4015a" (UID: "86774930-e326-4ebc-ba06-e7e96af4015a"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:23:02.903604 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:23:02.903579 2560 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/86774930-e326-4ebc-ba06-e7e96af4015a-bundle\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\""
Apr 24 21:23:02.903604 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:23:02.903601 2560 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4c578b81-499f-4987-ac53-a52d5fc7156a-bundle\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\""
Apr 24 21:23:02.903722 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:23:02.903609 2560 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sktqf\" (UniqueName: \"kubernetes.io/projected/86774930-e326-4ebc-ba06-e7e96af4015a-kube-api-access-sktqf\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\""
Apr 24 21:23:02.903722 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:23:02.903619 2560 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1ddaadbf-e834-45b4-8d39-300928d63db2-bundle\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\""
Apr 24 21:23:02.903722 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:23:02.903628 2560 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lgpl8\" (UniqueName: \"kubernetes.io/projected/4c578b81-499f-4987-ac53-a52d5fc7156a-kube-api-access-lgpl8\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\""
Apr 24 21:23:02.903722 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:23:02.903636 2560 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-t79lj\" (UniqueName: \"kubernetes.io/projected/1ddaadbf-e834-45b4-8d39-300928d63db2-kube-api-access-t79lj\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\""
Apr 24 21:23:02.903722 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:23:02.903644 2560 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4c578b81-499f-4987-ac53-a52d5fc7156a-util\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\""
Apr 24 21:23:02.903722 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:23:02.903653 2560 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1ddaadbf-e834-45b4-8d39-300928d63db2-util\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\""
Apr 24 21:23:02.903722 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:23:02.903660 2560 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/86774930-e326-4ebc-ba06-e7e96af4015a-util\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\""
Apr 24 21:23:03.590578 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:23:03.590550 2560 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30kpd9m"
Apr 24 21:23:03.591079 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:23:03.590540 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30kpd9m" event={"ID":"1ddaadbf-e834-45b4-8d39-300928d63db2","Type":"ContainerDied","Data":"2c38f6c66f4c9fad0f86702b942960337b921aea65746bdbe791bb19b668d04b"}
Apr 24 21:23:03.591079 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:23:03.590661 2560 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2c38f6c66f4c9fad0f86702b942960337b921aea65746bdbe791bb19b668d04b"
Apr 24 21:23:03.592314 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:23:03.592285 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88rz6d7" event={"ID":"86774930-e326-4ebc-ba06-e7e96af4015a","Type":"ContainerDied","Data":"60c662c461642df04d70e7986d5ace5868cee06526ecbff18784ffa5e7e12001"}
Apr 24 21:23:03.592314 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:23:03.592315 2560 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="60c662c461642df04d70e7986d5ace5868cee06526ecbff18784ffa5e7e12001"
Apr 24 21:23:03.592496 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:23:03.592291 2560 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88rz6d7"
Apr 24 21:23:03.593962 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:23:03.593938 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e50367lns" event={"ID":"4c578b81-499f-4987-ac53-a52d5fc7156a","Type":"ContainerDied","Data":"31e6845f675c2adc445227e5ee6ede7c521c6c40f729343fbe52150e923c28e6"}
Apr 24 21:23:03.594090 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:23:03.593965 2560 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="31e6845f675c2adc445227e5ee6ede7c521c6c40f729343fbe52150e923c28e6"
Apr 24 21:23:03.594090 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:23:03.593970 2560 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e50367lns"
Apr 24 21:23:24.380395 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:23:24.380360 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6c886788f8-c22xt"]
Apr 24 21:23:24.380840 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:23:24.380628 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1ddaadbf-e834-45b4-8d39-300928d63db2" containerName="pull"
Apr 24 21:23:24.380840 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:23:24.380638 2560 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ddaadbf-e834-45b4-8d39-300928d63db2" containerName="pull"
Apr 24 21:23:24.380840 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:23:24.380647 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1ddaadbf-e834-45b4-8d39-300928d63db2" containerName="util"
Apr 24 21:23:24.380840 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:23:24.380652 2560 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ddaadbf-e834-45b4-8d39-300928d63db2" containerName="util"
Apr 24 21:23:24.380840 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:23:24.380661 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b6f596de-438b-47d3-ab76-c5c146ee8432" containerName="pull"
Apr 24 21:23:24.380840 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:23:24.380666 2560 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6f596de-438b-47d3-ab76-c5c146ee8432" containerName="pull"
Apr 24 21:23:24.380840 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:23:24.380673 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4c578b81-499f-4987-ac53-a52d5fc7156a" containerName="util"
Apr 24 21:23:24.380840 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:23:24.380679 2560 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c578b81-499f-4987-ac53-a52d5fc7156a" containerName="util"
Apr 24 21:23:24.380840 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:23:24.380684 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4c578b81-499f-4987-ac53-a52d5fc7156a" containerName="pull"
Apr 24 21:23:24.380840 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:23:24.380688 2560 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c578b81-499f-4987-ac53-a52d5fc7156a" containerName="pull"
Apr 24 21:23:24.380840 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:23:24.380697 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b6f596de-438b-47d3-ab76-c5c146ee8432" containerName="util"
Apr 24 21:23:24.380840 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:23:24.380701 2560 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6f596de-438b-47d3-ab76-c5c146ee8432" containerName="util"
Apr 24 21:23:24.380840 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:23:24.380707 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="86774930-e326-4ebc-ba06-e7e96af4015a" containerName="pull"
Apr 24 21:23:24.380840 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:23:24.380712 2560 state_mem.go:107] "Deleted CPUSet assignment" podUID="86774930-e326-4ebc-ba06-e7e96af4015a" containerName="pull"
Apr 24 21:23:24.380840 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:23:24.380717 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4c578b81-499f-4987-ac53-a52d5fc7156a" containerName="extract"
Apr 24 21:23:24.380840 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:23:24.380721 2560 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c578b81-499f-4987-ac53-a52d5fc7156a" containerName="extract"
Apr 24 21:23:24.380840 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:23:24.380729 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="86774930-e326-4ebc-ba06-e7e96af4015a" containerName="extract"
Apr 24 21:23:24.380840 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:23:24.380736 2560 state_mem.go:107] "Deleted CPUSet assignment" podUID="86774930-e326-4ebc-ba06-e7e96af4015a" containerName="extract"
Apr 24 21:23:24.380840 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:23:24.380743 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b6f596de-438b-47d3-ab76-c5c146ee8432" containerName="extract"
Apr 24 21:23:24.380840 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:23:24.380748 2560 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6f596de-438b-47d3-ab76-c5c146ee8432" containerName="extract"
Apr 24 21:23:24.380840 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:23:24.380753 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1ddaadbf-e834-45b4-8d39-300928d63db2" containerName="extract"
Apr 24 21:23:24.380840 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:23:24.380758 2560 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ddaadbf-e834-45b4-8d39-300928d63db2" containerName="extract"
Apr 24 21:23:24.380840 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:23:24.380764 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="86774930-e326-4ebc-ba06-e7e96af4015a" containerName="util"
Apr 24 21:23:24.380840 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:23:24.380769 2560 state_mem.go:107] "Deleted CPUSet assignment" podUID="86774930-e326-4ebc-ba06-e7e96af4015a" containerName="util"
Apr 24 21:23:24.380840 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:23:24.380809 2560 memory_manager.go:356] "RemoveStaleState removing state" podUID="1ddaadbf-e834-45b4-8d39-300928d63db2" containerName="extract"
Apr 24 21:23:24.380840 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:23:24.380818 2560 memory_manager.go:356] "RemoveStaleState removing state" podUID="4c578b81-499f-4987-ac53-a52d5fc7156a" containerName="extract"
Apr 24 21:23:24.380840 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:23:24.380824 2560 memory_manager.go:356] "RemoveStaleState removing state" podUID="86774930-e326-4ebc-ba06-e7e96af4015a" containerName="extract"
Apr 24 21:23:24.380840 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:23:24.380830 2560 memory_manager.go:356] "RemoveStaleState removing state" podUID="b6f596de-438b-47d3-ab76-c5c146ee8432" containerName="extract"
Apr 24 21:23:24.383594 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:23:24.383577 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-c22xt"
Apr 24 21:23:24.385967 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:23:24.385918 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"plugin-serving-cert\""
Apr 24 21:23:24.386096 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:23:24.386018 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\""
Apr 24 21:23:24.386286 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:23:24.386272 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kuadrant-console-nginx-conf\""
Apr 24 21:23:24.386852 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:23:24.386838 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-pqks9\""
Apr 24 21:23:24.386897 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:23:24.386855 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\""
Apr 24 21:23:24.398283 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:23:24.398259 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6c886788f8-c22xt"]
Apr 24 21:23:24.449501 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:23:24.449472 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/0b22e842-e916-4964-b678-a49174bc3ed7-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-c22xt\" (UID: \"0b22e842-e916-4964-b678-a49174bc3ed7\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-c22xt"
Apr 24 21:23:24.449629 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:23:24.449509 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dvzn\" (UniqueName: \"kubernetes.io/projected/0b22e842-e916-4964-b678-a49174bc3ed7-kube-api-access-4dvzn\") pod \"kuadrant-console-plugin-6c886788f8-c22xt\" (UID: \"0b22e842-e916-4964-b678-a49174bc3ed7\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-c22xt"
Apr 24 21:23:24.449629 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:23:24.449566 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/0b22e842-e916-4964-b678-a49174bc3ed7-nginx-conf\") pod \"kuadrant-console-plugin-6c886788f8-c22xt\" (UID: \"0b22e842-e916-4964-b678-a49174bc3ed7\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-c22xt"
Apr 24 21:23:24.550142 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:23:24.550110 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/0b22e842-e916-4964-b678-a49174bc3ed7-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-c22xt\" (UID: \"0b22e842-e916-4964-b678-a49174bc3ed7\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-c22xt"
Apr 24 21:23:24.550313 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:23:24.550154 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4dvzn\" (UniqueName: \"kubernetes.io/projected/0b22e842-e916-4964-b678-a49174bc3ed7-kube-api-access-4dvzn\") pod \"kuadrant-console-plugin-6c886788f8-c22xt\" (UID: \"0b22e842-e916-4964-b678-a49174bc3ed7\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-c22xt"
Apr 24 21:23:24.550313 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:23:24.550185 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/0b22e842-e916-4964-b678-a49174bc3ed7-nginx-conf\") pod \"kuadrant-console-plugin-6c886788f8-c22xt\" (UID: \"0b22e842-e916-4964-b678-a49174bc3ed7\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-c22xt"
Apr 24 21:23:24.550313 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:23:24.550251 2560 secret.go:189] Couldn't get secret kuadrant-system/plugin-serving-cert: secret "plugin-serving-cert" not found
Apr 24 21:23:24.550478 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:23:24.550327 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0b22e842-e916-4964-b678-a49174bc3ed7-plugin-serving-cert podName:0b22e842-e916-4964-b678-a49174bc3ed7 nodeName:}" failed. No retries permitted until 2026-04-24 21:23:25.050306857 +0000 UTC m=+393.109094508 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/0b22e842-e916-4964-b678-a49174bc3ed7-plugin-serving-cert") pod "kuadrant-console-plugin-6c886788f8-c22xt" (UID: "0b22e842-e916-4964-b678-a49174bc3ed7") : secret "plugin-serving-cert" not found
Apr 24 21:23:24.550764 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:23:24.550745 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/0b22e842-e916-4964-b678-a49174bc3ed7-nginx-conf\") pod \"kuadrant-console-plugin-6c886788f8-c22xt\" (UID: \"0b22e842-e916-4964-b678-a49174bc3ed7\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-c22xt"
Apr 24 21:23:24.562299 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:23:24.562277 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dvzn\" (UniqueName: \"kubernetes.io/projected/0b22e842-e916-4964-b678-a49174bc3ed7-kube-api-access-4dvzn\") pod \"kuadrant-console-plugin-6c886788f8-c22xt\" (UID: \"0b22e842-e916-4964-b678-a49174bc3ed7\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-c22xt"
Apr 24 21:23:25.054062 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:23:25.054026 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/0b22e842-e916-4964-b678-a49174bc3ed7-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-c22xt\" (UID: \"0b22e842-e916-4964-b678-a49174bc3ed7\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-c22xt"
Apr 24 21:23:25.056455 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:23:25.056424 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/0b22e842-e916-4964-b678-a49174bc3ed7-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-c22xt\" (UID: \"0b22e842-e916-4964-b678-a49174bc3ed7\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-c22xt"
Apr 24 21:23:25.292354 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:23:25.292321 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-c22xt"
Apr 24 21:23:25.425436 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:23:25.425410 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6c886788f8-c22xt"]
Apr 24 21:23:25.427377 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:23:25.427350 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0b22e842_e916_4964_b678_a49174bc3ed7.slice/crio-cfe8d84f43e08f5d2126bb97f00b9d2dc3e2a7f285b56a435fdf3e1c6914cb66 WatchSource:0}: Error finding container cfe8d84f43e08f5d2126bb97f00b9d2dc3e2a7f285b56a435fdf3e1c6914cb66: Status 404 returned error can't find the container with id cfe8d84f43e08f5d2126bb97f00b9d2dc3e2a7f285b56a435fdf3e1c6914cb66
Apr 24 21:23:25.666048 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:23:25.665957 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-c22xt" event={"ID":"0b22e842-e916-4964-b678-a49174bc3ed7","Type":"ContainerStarted","Data":"cfe8d84f43e08f5d2126bb97f00b9d2dc3e2a7f285b56a435fdf3e1c6914cb66"}
Apr 24 21:23:30.685381 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:23:30.685344 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-c22xt" event={"ID":"0b22e842-e916-4964-b678-a49174bc3ed7","Type":"ContainerStarted","Data":"9c3dbab87f885766337bb113abd94830708f70f4383ebf32760cdf476d5cf846"}
Apr 24 21:23:30.705575 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:23:30.705529 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-c22xt" podStartSLOduration=2.311527072 podStartE2EDuration="6.705514289s" podCreationTimestamp="2026-04-24 21:23:24 +0000 UTC" firstStartedPulling="2026-04-24 21:23:25.428667365 +0000 UTC m=+393.487455015" lastFinishedPulling="2026-04-24 21:23:29.822654581 +0000 UTC m=+397.881442232" observedRunningTime="2026-04-24 21:23:30.702242289 +0000 UTC m=+398.761029961" watchObservedRunningTime="2026-04-24 21:23:30.705514289 +0000 UTC m=+398.764301961"
Apr 24 21:24:09.216210 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:24:09.216175 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-674b59b84c-tr7xg"]
Apr 24 21:24:09.239029 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:24:09.238998 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-674b59b84c-tr7xg"]
Apr 24 21:24:09.239192 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:24:09.239125 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-tr7xg"
Apr 24 21:24:09.241495 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:24:09.241476 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-thkz5\""
Apr 24 21:24:09.344621 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:24:09.344592 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-79cbc94b89-jht2m"]
Apr 24 21:24:09.347717 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:24:09.347702 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-jht2m"
Apr 24 21:24:09.354489 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:24:09.354470 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-jht2m"]
Apr 24 21:24:09.386913 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:24:09.386885 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7chdg\" (UniqueName: \"kubernetes.io/projected/962d8a1d-0c16-44d5-aa07-804ca2e537a5-kube-api-access-7chdg\") pod \"authorino-674b59b84c-tr7xg\" (UID: \"962d8a1d-0c16-44d5-aa07-804ca2e537a5\") " pod="kuadrant-system/authorino-674b59b84c-tr7xg"
Apr 24 21:24:09.487189 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:24:09.487135 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7chdg\" (UniqueName: \"kubernetes.io/projected/962d8a1d-0c16-44d5-aa07-804ca2e537a5-kube-api-access-7chdg\") pod \"authorino-674b59b84c-tr7xg\" (UID: \"962d8a1d-0c16-44d5-aa07-804ca2e537a5\") " pod="kuadrant-system/authorino-674b59b84c-tr7xg"
Apr 24 21:24:09.487282 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:24:09.487188 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9clg\" (UniqueName: \"kubernetes.io/projected/c3ea0413-b590-4a77-a320-9d4eb0eae4ca-kube-api-access-t9clg\") pod \"authorino-79cbc94b89-jht2m\" (UID: \"c3ea0413-b590-4a77-a320-9d4eb0eae4ca\") " pod="kuadrant-system/authorino-79cbc94b89-jht2m"
Apr 24 21:24:09.494756 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:24:09.494730 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7chdg\" (UniqueName: \"kubernetes.io/projected/962d8a1d-0c16-44d5-aa07-804ca2e537a5-kube-api-access-7chdg\") pod \"authorino-674b59b84c-tr7xg\" (UID: \"962d8a1d-0c16-44d5-aa07-804ca2e537a5\") " pod="kuadrant-system/authorino-674b59b84c-tr7xg"
Apr 24 21:24:09.548179 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:24:09.548158 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-tr7xg"
Apr 24 21:24:09.588023 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:24:09.587991 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t9clg\" (UniqueName: \"kubernetes.io/projected/c3ea0413-b590-4a77-a320-9d4eb0eae4ca-kube-api-access-t9clg\") pod \"authorino-79cbc94b89-jht2m\" (UID: \"c3ea0413-b590-4a77-a320-9d4eb0eae4ca\") " pod="kuadrant-system/authorino-79cbc94b89-jht2m"
Apr 24 21:24:09.597565 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:24:09.597536 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9clg\" (UniqueName: \"kubernetes.io/projected/c3ea0413-b590-4a77-a320-9d4eb0eae4ca-kube-api-access-t9clg\") pod \"authorino-79cbc94b89-jht2m\" (UID: \"c3ea0413-b590-4a77-a320-9d4eb0eae4ca\") " pod="kuadrant-system/authorino-79cbc94b89-jht2m"
Apr 24 21:24:09.657010 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:24:09.656983 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-jht2m"
Apr 24 21:24:09.665337 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:24:09.665200 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-674b59b84c-tr7xg"]
Apr 24 21:24:09.668235 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:24:09.668200 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod962d8a1d_0c16_44d5_aa07_804ca2e537a5.slice/crio-e7c6e7a1a4193038edef02b86d152e74788456195754feadfeff181232b781dd WatchSource:0}: Error finding container e7c6e7a1a4193038edef02b86d152e74788456195754feadfeff181232b781dd: Status 404 returned error can't find the container with id e7c6e7a1a4193038edef02b86d152e74788456195754feadfeff181232b781dd
Apr 24 21:24:09.775226 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:24:09.775201 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-jht2m"]
Apr 24 21:24:09.777168 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:24:09.777142 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc3ea0413_b590_4a77_a320_9d4eb0eae4ca.slice/crio-2e4236094e62a379e68f1809a520e440cd4f12b9ba35551f377ee8ba56969c35 WatchSource:0}: Error finding container 2e4236094e62a379e68f1809a520e440cd4f12b9ba35551f377ee8ba56969c35: Status 404 returned error can't find the container with id 2e4236094e62a379e68f1809a520e440cd4f12b9ba35551f377ee8ba56969c35
Apr 24 21:24:09.819237 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:24:09.819205 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-tr7xg" event={"ID":"962d8a1d-0c16-44d5-aa07-804ca2e537a5","Type":"ContainerStarted","Data":"e7c6e7a1a4193038edef02b86d152e74788456195754feadfeff181232b781dd"}
Apr 24 21:24:09.820164 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:24:09.820136 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-jht2m" event={"ID":"c3ea0413-b590-4a77-a320-9d4eb0eae4ca","Type":"ContainerStarted","Data":"2e4236094e62a379e68f1809a520e440cd4f12b9ba35551f377ee8ba56969c35"}
Apr 24 21:24:12.833152 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:24:12.833054 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-jht2m" event={"ID":"c3ea0413-b590-4a77-a320-9d4eb0eae4ca","Type":"ContainerStarted","Data":"fc722b8cca2e6b73de1b46b99738f42ca93c4ef71e5e83f7e14fc33df9172167"}
Apr 24 21:24:12.834388 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:24:12.834368 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-tr7xg" event={"ID":"962d8a1d-0c16-44d5-aa07-804ca2e537a5","Type":"ContainerStarted","Data":"4c23fc2cd18f0af71b4d0b877f8597532ca812b28b804e3d891d1c5f84b55e81"}
Apr 24 21:24:12.848311 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:24:12.848268 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-79cbc94b89-jht2m" podStartSLOduration=1.090523301 podStartE2EDuration="3.848256238s" podCreationTimestamp="2026-04-24 21:24:09 +0000 UTC" firstStartedPulling="2026-04-24 21:24:09.778485165 +0000 UTC m=+437.837272816" lastFinishedPulling="2026-04-24 21:24:12.536218102 +0000 UTC m=+440.595005753" observedRunningTime="2026-04-24 21:24:12.846980633 +0000 UTC m=+440.905768305" watchObservedRunningTime="2026-04-24 21:24:12.848256238 +0000 UTC m=+440.907043913"
Apr 24 21:24:12.860690 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:24:12.860641 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-674b59b84c-tr7xg" podStartSLOduration=1.004773982 podStartE2EDuration="3.860625403s" podCreationTimestamp="2026-04-24 21:24:09 +0000 UTC" firstStartedPulling="2026-04-24 21:24:09.670025626 +0000 UTC m=+437.728813290" lastFinishedPulling="2026-04-24 21:24:12.525877058 +0000 UTC m=+440.584664711" observedRunningTime="2026-04-24 21:24:12.860092673 +0000 UTC m=+440.918880345" watchObservedRunningTime="2026-04-24 21:24:12.860625403 +0000 UTC m=+440.919413076"
Apr 24 21:24:12.890800 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:24:12.890770 2560 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-674b59b84c-tr7xg"]
Apr 24 21:24:14.841851 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:24:14.841810 2560 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-674b59b84c-tr7xg" podUID="962d8a1d-0c16-44d5-aa07-804ca2e537a5" containerName="authorino" containerID="cri-o://4c23fc2cd18f0af71b4d0b877f8597532ca812b28b804e3d891d1c5f84b55e81" gracePeriod=30
Apr 24 21:24:15.072047 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:24:15.072025 2560 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-tr7xg"
Apr 24 21:24:15.234198 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:24:15.234122 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7chdg\" (UniqueName: \"kubernetes.io/projected/962d8a1d-0c16-44d5-aa07-804ca2e537a5-kube-api-access-7chdg\") pod \"962d8a1d-0c16-44d5-aa07-804ca2e537a5\" (UID: \"962d8a1d-0c16-44d5-aa07-804ca2e537a5\") "
Apr 24 21:24:15.236036 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:24:15.236007 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/962d8a1d-0c16-44d5-aa07-804ca2e537a5-kube-api-access-7chdg" (OuterVolumeSpecName: "kube-api-access-7chdg") pod "962d8a1d-0c16-44d5-aa07-804ca2e537a5" (UID: "962d8a1d-0c16-44d5-aa07-804ca2e537a5"). InnerVolumeSpecName "kube-api-access-7chdg".
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:24:15.335492 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:24:15.335466 2560 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7chdg\" (UniqueName: \"kubernetes.io/projected/962d8a1d-0c16-44d5-aa07-804ca2e537a5-kube-api-access-7chdg\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\"" Apr 24 21:24:15.846056 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:24:15.846017 2560 generic.go:358] "Generic (PLEG): container finished" podID="962d8a1d-0c16-44d5-aa07-804ca2e537a5" containerID="4c23fc2cd18f0af71b4d0b877f8597532ca812b28b804e3d891d1c5f84b55e81" exitCode=0 Apr 24 21:24:15.846452 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:24:15.846075 2560 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-tr7xg" Apr 24 21:24:15.846452 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:24:15.846102 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-tr7xg" event={"ID":"962d8a1d-0c16-44d5-aa07-804ca2e537a5","Type":"ContainerDied","Data":"4c23fc2cd18f0af71b4d0b877f8597532ca812b28b804e3d891d1c5f84b55e81"} Apr 24 21:24:15.846452 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:24:15.846139 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-tr7xg" event={"ID":"962d8a1d-0c16-44d5-aa07-804ca2e537a5","Type":"ContainerDied","Data":"e7c6e7a1a4193038edef02b86d152e74788456195754feadfeff181232b781dd"} Apr 24 21:24:15.846452 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:24:15.846156 2560 scope.go:117] "RemoveContainer" containerID="4c23fc2cd18f0af71b4d0b877f8597532ca812b28b804e3d891d1c5f84b55e81" Apr 24 21:24:15.854621 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:24:15.854605 2560 scope.go:117] "RemoveContainer" containerID="4c23fc2cd18f0af71b4d0b877f8597532ca812b28b804e3d891d1c5f84b55e81" Apr 24 21:24:15.854851 ip-10-0-128-142 kubenswrapper[2560]: 
E0424 21:24:15.854832 2560 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c23fc2cd18f0af71b4d0b877f8597532ca812b28b804e3d891d1c5f84b55e81\": container with ID starting with 4c23fc2cd18f0af71b4d0b877f8597532ca812b28b804e3d891d1c5f84b55e81 not found: ID does not exist" containerID="4c23fc2cd18f0af71b4d0b877f8597532ca812b28b804e3d891d1c5f84b55e81" Apr 24 21:24:15.854914 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:24:15.854863 2560 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c23fc2cd18f0af71b4d0b877f8597532ca812b28b804e3d891d1c5f84b55e81"} err="failed to get container status \"4c23fc2cd18f0af71b4d0b877f8597532ca812b28b804e3d891d1c5f84b55e81\": rpc error: code = NotFound desc = could not find container \"4c23fc2cd18f0af71b4d0b877f8597532ca812b28b804e3d891d1c5f84b55e81\": container with ID starting with 4c23fc2cd18f0af71b4d0b877f8597532ca812b28b804e3d891d1c5f84b55e81 not found: ID does not exist" Apr 24 21:24:15.865527 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:24:15.865507 2560 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-674b59b84c-tr7xg"] Apr 24 21:24:15.868851 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:24:15.868833 2560 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-674b59b84c-tr7xg"] Apr 24 21:24:16.444701 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:24:16.444674 2560 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="962d8a1d-0c16-44d5-aa07-804ca2e537a5" path="/var/lib/kubelet/pods/962d8a1d-0c16-44d5-aa07-804ca2e537a5/volumes" Apr 24 21:24:33.538465 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:24:33.538432 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-68bd676465-p8zxf"] Apr 24 21:24:33.538850 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:24:33.538688 2560 cpu_manager.go:401] "RemoveStaleState: 
containerMap: removing container" podUID="962d8a1d-0c16-44d5-aa07-804ca2e537a5" containerName="authorino" Apr 24 21:24:33.538850 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:24:33.538699 2560 state_mem.go:107] "Deleted CPUSet assignment" podUID="962d8a1d-0c16-44d5-aa07-804ca2e537a5" containerName="authorino" Apr 24 21:24:33.538850 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:24:33.538748 2560 memory_manager.go:356] "RemoveStaleState removing state" podUID="962d8a1d-0c16-44d5-aa07-804ca2e537a5" containerName="authorino" Apr 24 21:24:33.542804 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:24:33.542787 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-68bd676465-p8zxf" Apr 24 21:24:33.545311 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:24:33.545290 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-server-cert\"" Apr 24 21:24:33.549297 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:24:33.549088 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-68bd676465-p8zxf"] Apr 24 21:24:33.564965 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:24:33.564918 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/a2ae7efd-fb4c-4d74-8d95-c0a8961a47e9-tls-cert\") pod \"authorino-68bd676465-p8zxf\" (UID: \"a2ae7efd-fb4c-4d74-8d95-c0a8961a47e9\") " pod="kuadrant-system/authorino-68bd676465-p8zxf" Apr 24 21:24:33.565058 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:24:33.564975 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flzv9\" (UniqueName: \"kubernetes.io/projected/a2ae7efd-fb4c-4d74-8d95-c0a8961a47e9-kube-api-access-flzv9\") pod \"authorino-68bd676465-p8zxf\" (UID: \"a2ae7efd-fb4c-4d74-8d95-c0a8961a47e9\") " pod="kuadrant-system/authorino-68bd676465-p8zxf" Apr 24 
21:24:33.665273 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:24:33.665243 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/a2ae7efd-fb4c-4d74-8d95-c0a8961a47e9-tls-cert\") pod \"authorino-68bd676465-p8zxf\" (UID: \"a2ae7efd-fb4c-4d74-8d95-c0a8961a47e9\") " pod="kuadrant-system/authorino-68bd676465-p8zxf" Apr 24 21:24:33.665273 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:24:33.665274 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-flzv9\" (UniqueName: \"kubernetes.io/projected/a2ae7efd-fb4c-4d74-8d95-c0a8961a47e9-kube-api-access-flzv9\") pod \"authorino-68bd676465-p8zxf\" (UID: \"a2ae7efd-fb4c-4d74-8d95-c0a8961a47e9\") " pod="kuadrant-system/authorino-68bd676465-p8zxf" Apr 24 21:24:33.667560 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:24:33.667530 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/a2ae7efd-fb4c-4d74-8d95-c0a8961a47e9-tls-cert\") pod \"authorino-68bd676465-p8zxf\" (UID: \"a2ae7efd-fb4c-4d74-8d95-c0a8961a47e9\") " pod="kuadrant-system/authorino-68bd676465-p8zxf" Apr 24 21:24:33.673894 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:24:33.673875 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-flzv9\" (UniqueName: \"kubernetes.io/projected/a2ae7efd-fb4c-4d74-8d95-c0a8961a47e9-kube-api-access-flzv9\") pod \"authorino-68bd676465-p8zxf\" (UID: \"a2ae7efd-fb4c-4d74-8d95-c0a8961a47e9\") " pod="kuadrant-system/authorino-68bd676465-p8zxf" Apr 24 21:24:33.851603 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:24:33.851524 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-68bd676465-p8zxf" Apr 24 21:24:33.969279 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:24:33.969254 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-68bd676465-p8zxf"] Apr 24 21:24:33.971657 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:24:33.971633 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda2ae7efd_fb4c_4d74_8d95_c0a8961a47e9.slice/crio-917fe6ca5f86362141656f24d747779b6df18231bbb561cd003f6010b641c267 WatchSource:0}: Error finding container 917fe6ca5f86362141656f24d747779b6df18231bbb561cd003f6010b641c267: Status 404 returned error can't find the container with id 917fe6ca5f86362141656f24d747779b6df18231bbb561cd003f6010b641c267 Apr 24 21:24:34.913984 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:24:34.913953 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-68bd676465-p8zxf" event={"ID":"a2ae7efd-fb4c-4d74-8d95-c0a8961a47e9","Type":"ContainerStarted","Data":"3c9df70ac389637929d7aa3532e44a5e26af6ea44ae75fde8bae0d209d614403"} Apr 24 21:24:34.914386 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:24:34.913993 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-68bd676465-p8zxf" event={"ID":"a2ae7efd-fb4c-4d74-8d95-c0a8961a47e9","Type":"ContainerStarted","Data":"917fe6ca5f86362141656f24d747779b6df18231bbb561cd003f6010b641c267"} Apr 24 21:24:34.933189 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:24:34.933148 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-68bd676465-p8zxf" podStartSLOduration=1.603946155 podStartE2EDuration="1.933134051s" podCreationTimestamp="2026-04-24 21:24:33 +0000 UTC" firstStartedPulling="2026-04-24 21:24:33.972900841 +0000 UTC m=+462.031688494" lastFinishedPulling="2026-04-24 21:24:34.302088737 +0000 UTC m=+462.360876390" 
observedRunningTime="2026-04-24 21:24:34.93083386 +0000 UTC m=+462.989621532" watchObservedRunningTime="2026-04-24 21:24:34.933134051 +0000 UTC m=+462.991921723" Apr 24 21:24:34.965040 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:24:34.965013 2560 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-jht2m"] Apr 24 21:24:34.965218 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:24:34.965197 2560 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-79cbc94b89-jht2m" podUID="c3ea0413-b590-4a77-a320-9d4eb0eae4ca" containerName="authorino" containerID="cri-o://fc722b8cca2e6b73de1b46b99738f42ca93c4ef71e5e83f7e14fc33df9172167" gracePeriod=30 Apr 24 21:24:35.194295 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:24:35.194271 2560 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-jht2m" Apr 24 21:24:35.276979 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:24:35.276946 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t9clg\" (UniqueName: \"kubernetes.io/projected/c3ea0413-b590-4a77-a320-9d4eb0eae4ca-kube-api-access-t9clg\") pod \"c3ea0413-b590-4a77-a320-9d4eb0eae4ca\" (UID: \"c3ea0413-b590-4a77-a320-9d4eb0eae4ca\") " Apr 24 21:24:35.278855 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:24:35.278830 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3ea0413-b590-4a77-a320-9d4eb0eae4ca-kube-api-access-t9clg" (OuterVolumeSpecName: "kube-api-access-t9clg") pod "c3ea0413-b590-4a77-a320-9d4eb0eae4ca" (UID: "c3ea0413-b590-4a77-a320-9d4eb0eae4ca"). InnerVolumeSpecName "kube-api-access-t9clg". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:24:35.378178 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:24:35.378133 2560 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-t9clg\" (UniqueName: \"kubernetes.io/projected/c3ea0413-b590-4a77-a320-9d4eb0eae4ca-kube-api-access-t9clg\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\"" Apr 24 21:24:35.918591 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:24:35.918555 2560 generic.go:358] "Generic (PLEG): container finished" podID="c3ea0413-b590-4a77-a320-9d4eb0eae4ca" containerID="fc722b8cca2e6b73de1b46b99738f42ca93c4ef71e5e83f7e14fc33df9172167" exitCode=0 Apr 24 21:24:35.919022 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:24:35.918607 2560 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-jht2m" Apr 24 21:24:35.919022 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:24:35.918638 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-jht2m" event={"ID":"c3ea0413-b590-4a77-a320-9d4eb0eae4ca","Type":"ContainerDied","Data":"fc722b8cca2e6b73de1b46b99738f42ca93c4ef71e5e83f7e14fc33df9172167"} Apr 24 21:24:35.919022 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:24:35.918676 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-jht2m" event={"ID":"c3ea0413-b590-4a77-a320-9d4eb0eae4ca","Type":"ContainerDied","Data":"2e4236094e62a379e68f1809a520e440cd4f12b9ba35551f377ee8ba56969c35"} Apr 24 21:24:35.919022 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:24:35.918691 2560 scope.go:117] "RemoveContainer" containerID="fc722b8cca2e6b73de1b46b99738f42ca93c4ef71e5e83f7e14fc33df9172167" Apr 24 21:24:35.927475 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:24:35.927459 2560 scope.go:117] "RemoveContainer" containerID="fc722b8cca2e6b73de1b46b99738f42ca93c4ef71e5e83f7e14fc33df9172167" Apr 24 21:24:35.927699 ip-10-0-128-142 kubenswrapper[2560]: 
E0424 21:24:35.927681 2560 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc722b8cca2e6b73de1b46b99738f42ca93c4ef71e5e83f7e14fc33df9172167\": container with ID starting with fc722b8cca2e6b73de1b46b99738f42ca93c4ef71e5e83f7e14fc33df9172167 not found: ID does not exist" containerID="fc722b8cca2e6b73de1b46b99738f42ca93c4ef71e5e83f7e14fc33df9172167" Apr 24 21:24:35.927766 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:24:35.927711 2560 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc722b8cca2e6b73de1b46b99738f42ca93c4ef71e5e83f7e14fc33df9172167"} err="failed to get container status \"fc722b8cca2e6b73de1b46b99738f42ca93c4ef71e5e83f7e14fc33df9172167\": rpc error: code = NotFound desc = could not find container \"fc722b8cca2e6b73de1b46b99738f42ca93c4ef71e5e83f7e14fc33df9172167\": container with ID starting with fc722b8cca2e6b73de1b46b99738f42ca93c4ef71e5e83f7e14fc33df9172167 not found: ID does not exist" Apr 24 21:24:35.939167 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:24:35.939145 2560 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-jht2m"] Apr 24 21:24:35.942342 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:24:35.942322 2560 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-jht2m"] Apr 24 21:24:36.445507 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:24:36.445478 2560 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3ea0413-b590-4a77-a320-9d4eb0eae4ca" path="/var/lib/kubelet/pods/c3ea0413-b590-4a77-a320-9d4eb0eae4ca/volumes" Apr 24 21:24:53.366944 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:24:53.366894 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-4xfmc"] Apr 24 21:24:53.367323 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:24:53.367205 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: 
removing container" podUID="c3ea0413-b590-4a77-a320-9d4eb0eae4ca" containerName="authorino" Apr 24 21:24:53.367323 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:24:53.367218 2560 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3ea0413-b590-4a77-a320-9d4eb0eae4ca" containerName="authorino" Apr 24 21:24:53.367323 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:24:53.367260 2560 memory_manager.go:356] "RemoveStaleState removing state" podUID="c3ea0413-b590-4a77-a320-9d4eb0eae4ca" containerName="authorino" Apr 24 21:24:53.370541 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:24:53.370524 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-4xfmc" Apr 24 21:24:53.373570 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:24:53.373536 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 24 21:24:53.373706 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:24:53.373690 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 24 21:24:53.373806 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:24:53.373796 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 24 21:24:53.374558 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:24:53.374540 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-7skzq\"" Apr 24 21:24:53.383978 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:24:53.383956 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-4xfmc"] Apr 24 21:24:53.407842 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:24:53.407815 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fn4mt\" (UniqueName: \"kubernetes.io/projected/c24a9416-1b41-4f60-bf4e-9cc6f95eac37-kube-api-access-fn4mt\") 
pod \"seaweedfs-86cc847c5c-4xfmc\" (UID: \"c24a9416-1b41-4f60-bf4e-9cc6f95eac37\") " pod="kserve/seaweedfs-86cc847c5c-4xfmc" Apr 24 21:24:53.407952 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:24:53.407846 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/c24a9416-1b41-4f60-bf4e-9cc6f95eac37-data\") pod \"seaweedfs-86cc847c5c-4xfmc\" (UID: \"c24a9416-1b41-4f60-bf4e-9cc6f95eac37\") " pod="kserve/seaweedfs-86cc847c5c-4xfmc" Apr 24 21:24:53.508907 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:24:53.508877 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fn4mt\" (UniqueName: \"kubernetes.io/projected/c24a9416-1b41-4f60-bf4e-9cc6f95eac37-kube-api-access-fn4mt\") pod \"seaweedfs-86cc847c5c-4xfmc\" (UID: \"c24a9416-1b41-4f60-bf4e-9cc6f95eac37\") " pod="kserve/seaweedfs-86cc847c5c-4xfmc" Apr 24 21:24:53.508907 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:24:53.508908 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/c24a9416-1b41-4f60-bf4e-9cc6f95eac37-data\") pod \"seaweedfs-86cc847c5c-4xfmc\" (UID: \"c24a9416-1b41-4f60-bf4e-9cc6f95eac37\") " pod="kserve/seaweedfs-86cc847c5c-4xfmc" Apr 24 21:24:53.509248 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:24:53.509233 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/c24a9416-1b41-4f60-bf4e-9cc6f95eac37-data\") pod \"seaweedfs-86cc847c5c-4xfmc\" (UID: \"c24a9416-1b41-4f60-bf4e-9cc6f95eac37\") " pod="kserve/seaweedfs-86cc847c5c-4xfmc" Apr 24 21:24:53.517286 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:24:53.517264 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fn4mt\" (UniqueName: \"kubernetes.io/projected/c24a9416-1b41-4f60-bf4e-9cc6f95eac37-kube-api-access-fn4mt\") pod 
\"seaweedfs-86cc847c5c-4xfmc\" (UID: \"c24a9416-1b41-4f60-bf4e-9cc6f95eac37\") " pod="kserve/seaweedfs-86cc847c5c-4xfmc" Apr 24 21:24:53.679696 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:24:53.679634 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-4xfmc" Apr 24 21:24:53.798914 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:24:53.798890 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-4xfmc"] Apr 24 21:24:53.801081 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:24:53.801057 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc24a9416_1b41_4f60_bf4e_9cc6f95eac37.slice/crio-bbd6db14555b01b48f85a2da083802cf14afd96a7a7303c44016e945d1365def WatchSource:0}: Error finding container bbd6db14555b01b48f85a2da083802cf14afd96a7a7303c44016e945d1365def: Status 404 returned error can't find the container with id bbd6db14555b01b48f85a2da083802cf14afd96a7a7303c44016e945d1365def Apr 24 21:24:53.984042 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:24:53.983964 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-4xfmc" event={"ID":"c24a9416-1b41-4f60-bf4e-9cc6f95eac37","Type":"ContainerStarted","Data":"bbd6db14555b01b48f85a2da083802cf14afd96a7a7303c44016e945d1365def"} Apr 24 21:24:56.997936 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:24:56.997881 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-4xfmc" event={"ID":"c24a9416-1b41-4f60-bf4e-9cc6f95eac37","Type":"ContainerStarted","Data":"ef49f80fd44182ed680ac44e2d08a628805e06f817026090da5570404934274d"} Apr 24 21:24:56.998352 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:24:56.997962 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/seaweedfs-86cc847c5c-4xfmc" Apr 24 21:24:57.014572 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:24:57.014525 2560 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-86cc847c5c-4xfmc" podStartSLOduration=1.438314747 podStartE2EDuration="4.014510156s" podCreationTimestamp="2026-04-24 21:24:53 +0000 UTC" firstStartedPulling="2026-04-24 21:24:53.80239505 +0000 UTC m=+481.861182714" lastFinishedPulling="2026-04-24 21:24:56.378590473 +0000 UTC m=+484.437378123" observedRunningTime="2026-04-24 21:24:57.013612726 +0000 UTC m=+485.072400401" watchObservedRunningTime="2026-04-24 21:24:57.014510156 +0000 UTC m=+485.073297828" Apr 24 21:25:03.003245 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:25:03.003213 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-4xfmc" Apr 24 21:26:21.822746 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:26:21.822716 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-8h7vn"] Apr 24 21:26:21.824754 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:26:21.824738 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-8h7vn" Apr 24 21:26:21.832802 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:26:21.832780 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-8h7vn"] Apr 24 21:26:21.861734 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:26:21.861708 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnqtl\" (UniqueName: \"kubernetes.io/projected/94e3db84-8dcf-4c23-920a-cdf7da63973e-kube-api-access-fnqtl\") pod \"s3-init-8h7vn\" (UID: \"94e3db84-8dcf-4c23-920a-cdf7da63973e\") " pod="kserve/s3-init-8h7vn" Apr 24 21:26:21.962972 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:26:21.962937 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fnqtl\" (UniqueName: \"kubernetes.io/projected/94e3db84-8dcf-4c23-920a-cdf7da63973e-kube-api-access-fnqtl\") pod \"s3-init-8h7vn\" (UID: \"94e3db84-8dcf-4c23-920a-cdf7da63973e\") " pod="kserve/s3-init-8h7vn" Apr 24 21:26:21.971911 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:26:21.971889 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnqtl\" (UniqueName: \"kubernetes.io/projected/94e3db84-8dcf-4c23-920a-cdf7da63973e-kube-api-access-fnqtl\") pod \"s3-init-8h7vn\" (UID: \"94e3db84-8dcf-4c23-920a-cdf7da63973e\") " pod="kserve/s3-init-8h7vn" Apr 24 21:26:22.133672 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:26:22.133601 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-8h7vn" Apr 24 21:26:22.249068 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:26:22.249043 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-8h7vn"] Apr 24 21:26:22.250805 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:26:22.250781 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod94e3db84_8dcf_4c23_920a_cdf7da63973e.slice/crio-6e8da1ccfbca4288fe4363d3cf7069e5b7f045aa92600788ced8c1c62b41adea WatchSource:0}: Error finding container 6e8da1ccfbca4288fe4363d3cf7069e5b7f045aa92600788ced8c1c62b41adea: Status 404 returned error can't find the container with id 6e8da1ccfbca4288fe4363d3cf7069e5b7f045aa92600788ced8c1c62b41adea Apr 24 21:26:22.289266 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:26:22.289240 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-8h7vn" event={"ID":"94e3db84-8dcf-4c23-920a-cdf7da63973e","Type":"ContainerStarted","Data":"6e8da1ccfbca4288fe4363d3cf7069e5b7f045aa92600788ced8c1c62b41adea"} Apr 24 21:26:27.311320 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:26:27.311282 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-8h7vn" event={"ID":"94e3db84-8dcf-4c23-920a-cdf7da63973e","Type":"ContainerStarted","Data":"5978c3a0b3628477019178d8c5e718e4805c404fb864dbaaf66fceb4cbd46f51"} Apr 24 21:26:27.327573 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:26:27.327528 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-8h7vn" podStartSLOduration=1.90511853 podStartE2EDuration="6.327514444s" podCreationTimestamp="2026-04-24 21:26:21 +0000 UTC" firstStartedPulling="2026-04-24 21:26:22.252643657 +0000 UTC m=+570.311431307" lastFinishedPulling="2026-04-24 21:26:26.675039572 +0000 UTC m=+574.733827221" observedRunningTime="2026-04-24 21:26:27.324702808 +0000 UTC m=+575.383490481" watchObservedRunningTime="2026-04-24 
21:26:27.327514444 +0000 UTC m=+575.386302115"
Apr 24 21:26:30.321330 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:26:30.321294 2560 generic.go:358] "Generic (PLEG): container finished" podID="94e3db84-8dcf-4c23-920a-cdf7da63973e" containerID="5978c3a0b3628477019178d8c5e718e4805c404fb864dbaaf66fceb4cbd46f51" exitCode=0
Apr 24 21:26:30.321714 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:26:30.321365 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-8h7vn" event={"ID":"94e3db84-8dcf-4c23-920a-cdf7da63973e","Type":"ContainerDied","Data":"5978c3a0b3628477019178d8c5e718e4805c404fb864dbaaf66fceb4cbd46f51"}
Apr 24 21:26:31.449744 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:26:31.449722 2560 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-8h7vn"
Apr 24 21:26:31.536866 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:26:31.536824 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fnqtl\" (UniqueName: \"kubernetes.io/projected/94e3db84-8dcf-4c23-920a-cdf7da63973e-kube-api-access-fnqtl\") pod \"94e3db84-8dcf-4c23-920a-cdf7da63973e\" (UID: \"94e3db84-8dcf-4c23-920a-cdf7da63973e\") "
Apr 24 21:26:31.539056 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:26:31.539027 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94e3db84-8dcf-4c23-920a-cdf7da63973e-kube-api-access-fnqtl" (OuterVolumeSpecName: "kube-api-access-fnqtl") pod "94e3db84-8dcf-4c23-920a-cdf7da63973e" (UID: "94e3db84-8dcf-4c23-920a-cdf7da63973e"). InnerVolumeSpecName "kube-api-access-fnqtl". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:26:31.637798 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:26:31.637723 2560 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fnqtl\" (UniqueName: \"kubernetes.io/projected/94e3db84-8dcf-4c23-920a-cdf7da63973e-kube-api-access-fnqtl\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\""
Apr 24 21:26:32.330109 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:26:32.330082 2560 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-8h7vn"
Apr 24 21:26:32.330278 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:26:32.330082 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-8h7vn" event={"ID":"94e3db84-8dcf-4c23-920a-cdf7da63973e","Type":"ContainerDied","Data":"6e8da1ccfbca4288fe4363d3cf7069e5b7f045aa92600788ced8c1c62b41adea"}
Apr 24 21:26:32.330278 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:26:32.330189 2560 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6e8da1ccfbca4288fe4363d3cf7069e5b7f045aa92600788ced8c1c62b41adea"
Apr 24 21:26:42.619128 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:26:42.619092 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-cm2xx"]
Apr 24 21:26:42.619568 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:26:42.619385 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="94e3db84-8dcf-4c23-920a-cdf7da63973e" containerName="s3-init"
Apr 24 21:26:42.619568 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:26:42.619396 2560 state_mem.go:107] "Deleted CPUSet assignment" podUID="94e3db84-8dcf-4c23-920a-cdf7da63973e" containerName="s3-init"
Apr 24 21:26:42.619568 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:26:42.619449 2560 memory_manager.go:356] "RemoveStaleState removing state" podUID="94e3db84-8dcf-4c23-920a-cdf7da63973e" containerName="s3-init"
Apr 24 21:26:42.621957 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:26:42.621916 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-cm2xx"
Apr 24 21:26:42.625398 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:26:42.625371 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\""
Apr 24 21:26:42.625519 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:26:42.625417 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 24 21:26:42.625752 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:26:42.625728 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"istio-ca-root-cert\""
Apr 24 21:26:42.626158 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:26:42.626139 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-gateway-1-openshift-default-dockercfg-rrd7g\""
Apr 24 21:26:42.637520 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:26:42.637497 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-cm2xx"]
Apr 24 21:26:42.719221 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:26:42.719188 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/07c9262b-512b-4a5e-991a-8562666209e4-credential-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-cm2xx\" (UID: \"07c9262b-512b-4a5e-991a-8562666209e4\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-cm2xx"
Apr 24 21:26:42.719221 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:26:42.719223 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/07c9262b-512b-4a5e-991a-8562666209e4-istio-envoy\") pod \"router-gateway-1-openshift-default-6c59fbf55c-cm2xx\" (UID: \"07c9262b-512b-4a5e-991a-8562666209e4\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-cm2xx"
Apr 24 21:26:42.719404 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:26:42.719247 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmfqj\" (UniqueName: \"kubernetes.io/projected/07c9262b-512b-4a5e-991a-8562666209e4-kube-api-access-wmfqj\") pod \"router-gateway-1-openshift-default-6c59fbf55c-cm2xx\" (UID: \"07c9262b-512b-4a5e-991a-8562666209e4\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-cm2xx"
Apr 24 21:26:42.719404 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:26:42.719285 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/07c9262b-512b-4a5e-991a-8562666209e4-istio-podinfo\") pod \"router-gateway-1-openshift-default-6c59fbf55c-cm2xx\" (UID: \"07c9262b-512b-4a5e-991a-8562666209e4\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-cm2xx"
Apr 24 21:26:42.719404 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:26:42.719304 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/07c9262b-512b-4a5e-991a-8562666209e4-istiod-ca-cert\") pod \"router-gateway-1-openshift-default-6c59fbf55c-cm2xx\" (UID: \"07c9262b-512b-4a5e-991a-8562666209e4\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-cm2xx"
Apr 24 21:26:42.719404 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:26:42.719322 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/07c9262b-512b-4a5e-991a-8562666209e4-istio-data\") pod \"router-gateway-1-openshift-default-6c59fbf55c-cm2xx\" (UID: \"07c9262b-512b-4a5e-991a-8562666209e4\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-cm2xx"
Apr 24 21:26:42.719526 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:26:42.719427 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/07c9262b-512b-4a5e-991a-8562666209e4-workload-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-cm2xx\" (UID: \"07c9262b-512b-4a5e-991a-8562666209e4\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-cm2xx"
Apr 24 21:26:42.719526 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:26:42.719452 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/07c9262b-512b-4a5e-991a-8562666209e4-istio-token\") pod \"router-gateway-1-openshift-default-6c59fbf55c-cm2xx\" (UID: \"07c9262b-512b-4a5e-991a-8562666209e4\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-cm2xx"
Apr 24 21:26:42.719526 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:26:42.719474 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/07c9262b-512b-4a5e-991a-8562666209e4-workload-certs\") pod \"router-gateway-1-openshift-default-6c59fbf55c-cm2xx\" (UID: \"07c9262b-512b-4a5e-991a-8562666209e4\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-cm2xx"
Apr 24 21:26:42.820388 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:26:42.820353 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/07c9262b-512b-4a5e-991a-8562666209e4-workload-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-cm2xx\" (UID: \"07c9262b-512b-4a5e-991a-8562666209e4\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-cm2xx"
Apr 24 21:26:42.820388 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:26:42.820390 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/07c9262b-512b-4a5e-991a-8562666209e4-istio-token\") pod \"router-gateway-1-openshift-default-6c59fbf55c-cm2xx\" (UID: \"07c9262b-512b-4a5e-991a-8562666209e4\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-cm2xx"
Apr 24 21:26:42.820567 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:26:42.820411 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/07c9262b-512b-4a5e-991a-8562666209e4-workload-certs\") pod \"router-gateway-1-openshift-default-6c59fbf55c-cm2xx\" (UID: \"07c9262b-512b-4a5e-991a-8562666209e4\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-cm2xx"
Apr 24 21:26:42.820567 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:26:42.820447 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/07c9262b-512b-4a5e-991a-8562666209e4-credential-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-cm2xx\" (UID: \"07c9262b-512b-4a5e-991a-8562666209e4\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-cm2xx"
Apr 24 21:26:42.820567 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:26:42.820475 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/07c9262b-512b-4a5e-991a-8562666209e4-istio-envoy\") pod \"router-gateway-1-openshift-default-6c59fbf55c-cm2xx\" (UID: \"07c9262b-512b-4a5e-991a-8562666209e4\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-cm2xx"
Apr 24 21:26:42.820567 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:26:42.820506 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wmfqj\" (UniqueName: \"kubernetes.io/projected/07c9262b-512b-4a5e-991a-8562666209e4-kube-api-access-wmfqj\") pod \"router-gateway-1-openshift-default-6c59fbf55c-cm2xx\" (UID: \"07c9262b-512b-4a5e-991a-8562666209e4\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-cm2xx"
Apr 24 21:26:42.820567 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:26:42.820543 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/07c9262b-512b-4a5e-991a-8562666209e4-istio-podinfo\") pod \"router-gateway-1-openshift-default-6c59fbf55c-cm2xx\" (UID: \"07c9262b-512b-4a5e-991a-8562666209e4\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-cm2xx"
Apr 24 21:26:42.820774 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:26:42.820567 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/07c9262b-512b-4a5e-991a-8562666209e4-istiod-ca-cert\") pod \"router-gateway-1-openshift-default-6c59fbf55c-cm2xx\" (UID: \"07c9262b-512b-4a5e-991a-8562666209e4\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-cm2xx"
Apr 24 21:26:42.820774 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:26:42.820595 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/07c9262b-512b-4a5e-991a-8562666209e4-istio-data\") pod \"router-gateway-1-openshift-default-6c59fbf55c-cm2xx\" (UID: \"07c9262b-512b-4a5e-991a-8562666209e4\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-cm2xx"
Apr 24 21:26:42.820889 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:26:42.820868 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/07c9262b-512b-4a5e-991a-8562666209e4-workload-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-cm2xx\" (UID: \"07c9262b-512b-4a5e-991a-8562666209e4\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-cm2xx"
Apr 24 21:26:42.820970 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:26:42.820946 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/07c9262b-512b-4a5e-991a-8562666209e4-credential-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-cm2xx\" (UID: \"07c9262b-512b-4a5e-991a-8562666209e4\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-cm2xx"
Apr 24 21:26:42.821049 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:26:42.821013 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/07c9262b-512b-4a5e-991a-8562666209e4-istio-data\") pod \"router-gateway-1-openshift-default-6c59fbf55c-cm2xx\" (UID: \"07c9262b-512b-4a5e-991a-8562666209e4\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-cm2xx"
Apr 24 21:26:42.821138 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:26:42.821110 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/07c9262b-512b-4a5e-991a-8562666209e4-workload-certs\") pod \"router-gateway-1-openshift-default-6c59fbf55c-cm2xx\" (UID: \"07c9262b-512b-4a5e-991a-8562666209e4\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-cm2xx"
Apr 24 21:26:42.821342 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:26:42.821321 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/07c9262b-512b-4a5e-991a-8562666209e4-istiod-ca-cert\") pod \"router-gateway-1-openshift-default-6c59fbf55c-cm2xx\" (UID: \"07c9262b-512b-4a5e-991a-8562666209e4\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-cm2xx"
Apr 24 21:26:42.822890 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:26:42.822852 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/07c9262b-512b-4a5e-991a-8562666209e4-istio-envoy\") pod \"router-gateway-1-openshift-default-6c59fbf55c-cm2xx\" (UID: \"07c9262b-512b-4a5e-991a-8562666209e4\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-cm2xx"
Apr 24 21:26:42.823034 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:26:42.823016 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/07c9262b-512b-4a5e-991a-8562666209e4-istio-podinfo\") pod \"router-gateway-1-openshift-default-6c59fbf55c-cm2xx\" (UID: \"07c9262b-512b-4a5e-991a-8562666209e4\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-cm2xx"
Apr 24 21:26:42.829223 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:26:42.829198 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/07c9262b-512b-4a5e-991a-8562666209e4-istio-token\") pod \"router-gateway-1-openshift-default-6c59fbf55c-cm2xx\" (UID: \"07c9262b-512b-4a5e-991a-8562666209e4\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-cm2xx"
Apr 24 21:26:42.829460 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:26:42.829438 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmfqj\" (UniqueName: \"kubernetes.io/projected/07c9262b-512b-4a5e-991a-8562666209e4-kube-api-access-wmfqj\") pod \"router-gateway-1-openshift-default-6c59fbf55c-cm2xx\" (UID: \"07c9262b-512b-4a5e-991a-8562666209e4\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-cm2xx"
Apr 24 21:26:42.934608 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:26:42.934536 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-cm2xx"
Apr 24 21:26:43.061107 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:26:43.061081 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-cm2xx"]
Apr 24 21:26:43.062029 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:26:43.061999 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod07c9262b_512b_4a5e_991a_8562666209e4.slice/crio-c3813d1f323fa2c0d659442d6a51fa862fa9fd53505bfac847197b23d5190938 WatchSource:0}: Error finding container c3813d1f323fa2c0d659442d6a51fa862fa9fd53505bfac847197b23d5190938: Status 404 returned error can't find the container with id c3813d1f323fa2c0d659442d6a51fa862fa9fd53505bfac847197b23d5190938
Apr 24 21:26:43.367465 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:26:43.367426 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-cm2xx" event={"ID":"07c9262b-512b-4a5e-991a-8562666209e4","Type":"ContainerStarted","Data":"c3813d1f323fa2c0d659442d6a51fa862fa9fd53505bfac847197b23d5190938"}
Apr 24 21:26:45.760355 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:26:45.760316 2560 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236216Ki","pods":"250"}
Apr 24 21:26:45.760661 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:26:45.760395 2560 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236216Ki","pods":"250"}
Apr 24 21:26:45.760661 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:26:45.760453 2560 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236216Ki","pods":"250"}
Apr 24 21:26:46.380402 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:26:46.380321 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-cm2xx" event={"ID":"07c9262b-512b-4a5e-991a-8562666209e4","Type":"ContainerStarted","Data":"3c9d76b2c9b268831c4a628aab93d1fba6d784a6d7e65949f63d744d43cfa2f3"}
Apr 24 21:26:46.404918 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:26:46.404873 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-cm2xx" podStartSLOduration=1.708530556 podStartE2EDuration="4.40485888s" podCreationTimestamp="2026-04-24 21:26:42 +0000 UTC" firstStartedPulling="2026-04-24 21:26:43.063779813 +0000 UTC m=+591.122567463" lastFinishedPulling="2026-04-24 21:26:45.760108134 +0000 UTC m=+593.818895787" observedRunningTime="2026-04-24 21:26:46.401619489 +0000 UTC m=+594.460407161" watchObservedRunningTime="2026-04-24 21:26:46.40485888 +0000 UTC m=+594.463646552"
Apr 24 21:26:46.935251 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:26:46.935218 2560 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-cm2xx"
Apr 24 21:26:46.939918 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:26:46.939893 2560 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-cm2xx"
Apr 24 21:26:47.383738 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:26:47.383708 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-cm2xx"
Apr 24 21:26:47.384579 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:26:47.384561 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-cm2xx"
Apr 24 21:26:52.362401 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:26:52.362371 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6rbsp_260be1ca-939b-4b9e-9f93-078d2506aef0/ovn-acl-logging/0.log"
Apr 24 21:26:52.362401 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:26:52.362408 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6rbsp_260be1ca-939b-4b9e-9f93-078d2506aef0/ovn-acl-logging/0.log"
Apr 24 21:26:52.363471 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:26:52.363451 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-128-142.ec2.internal_784ad9f5e87f1e75291c25fc06106e5e/kube-rbac-proxy-crio/2.log"
Apr 24 21:26:52.363545 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:26:52.363451 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-128-142.ec2.internal_784ad9f5e87f1e75291c25fc06106e5e/kube-rbac-proxy-crio/2.log"
Apr 24 21:27:15.493386 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:27:15.493309 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-b779bbb84-nl954"]
Apr 24 21:27:15.496477 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:27:15.496461 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-b779bbb84-nl954"
Apr 24 21:27:15.500112 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:27:15.500082 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"gw-sec0c69dceeb48768325d1a53a749e65786-kserve-self-signed-certs\""
Apr 24 21:27:15.500225 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:27:15.500092 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-qdwck\""
Apr 24 21:27:15.511594 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:27:15.511571 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-b779bbb84-nl954"]
Apr 24 21:27:15.677281 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:27:15.677250 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5308f447-8a0e-4789-a395-47212f6a12f4-tmp-dir\") pod \"gw-section-name-router-with-gat-2f0a622e-kserve-b779bbb84-nl954\" (UID: \"5308f447-8a0e-4789-a395-47212f6a12f4\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-b779bbb84-nl954"
Apr 24 21:27:15.677500 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:27:15.677296 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5308f447-8a0e-4789-a395-47212f6a12f4-kserve-provision-location\") pod \"gw-section-name-router-with-gat-2f0a622e-kserve-b779bbb84-nl954\" (UID: \"5308f447-8a0e-4789-a395-47212f6a12f4\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-b779bbb84-nl954"
Apr 24 21:27:15.677500 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:27:15.677348 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jfrj\" (UniqueName: \"kubernetes.io/projected/5308f447-8a0e-4789-a395-47212f6a12f4-kube-api-access-6jfrj\") pod \"gw-section-name-router-with-gat-2f0a622e-kserve-b779bbb84-nl954\" (UID: \"5308f447-8a0e-4789-a395-47212f6a12f4\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-b779bbb84-nl954"
Apr 24 21:27:15.677500 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:27:15.677378 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5308f447-8a0e-4789-a395-47212f6a12f4-dshm\") pod \"gw-section-name-router-with-gat-2f0a622e-kserve-b779bbb84-nl954\" (UID: \"5308f447-8a0e-4789-a395-47212f6a12f4\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-b779bbb84-nl954"
Apr 24 21:27:15.677500 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:27:15.677405 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5308f447-8a0e-4789-a395-47212f6a12f4-model-cache\") pod \"gw-section-name-router-with-gat-2f0a622e-kserve-b779bbb84-nl954\" (UID: \"5308f447-8a0e-4789-a395-47212f6a12f4\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-b779bbb84-nl954"
Apr 24 21:27:15.677500 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:27:15.677437 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5308f447-8a0e-4789-a395-47212f6a12f4-home\") pod \"gw-section-name-router-with-gat-2f0a622e-kserve-b779bbb84-nl954\" (UID: \"5308f447-8a0e-4789-a395-47212f6a12f4\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-b779bbb84-nl954"
Apr 24 21:27:15.677500 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:27:15.677495 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5308f447-8a0e-4789-a395-47212f6a12f4-tls-certs\") pod \"gw-section-name-router-with-gat-2f0a622e-kserve-b779bbb84-nl954\" (UID: \"5308f447-8a0e-4789-a395-47212f6a12f4\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-b779bbb84-nl954"
Apr 24 21:27:15.778382 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:27:15.778349 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5308f447-8a0e-4789-a395-47212f6a12f4-tmp-dir\") pod \"gw-section-name-router-with-gat-2f0a622e-kserve-b779bbb84-nl954\" (UID: \"5308f447-8a0e-4789-a395-47212f6a12f4\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-b779bbb84-nl954"
Apr 24 21:27:15.778558 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:27:15.778394 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5308f447-8a0e-4789-a395-47212f6a12f4-kserve-provision-location\") pod \"gw-section-name-router-with-gat-2f0a622e-kserve-b779bbb84-nl954\" (UID: \"5308f447-8a0e-4789-a395-47212f6a12f4\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-b779bbb84-nl954"
Apr 24 21:27:15.778558 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:27:15.778521 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6jfrj\" (UniqueName: \"kubernetes.io/projected/5308f447-8a0e-4789-a395-47212f6a12f4-kube-api-access-6jfrj\") pod \"gw-section-name-router-with-gat-2f0a622e-kserve-b779bbb84-nl954\" (UID: \"5308f447-8a0e-4789-a395-47212f6a12f4\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-b779bbb84-nl954"
Apr 24 21:27:15.778676 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:27:15.778574 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5308f447-8a0e-4789-a395-47212f6a12f4-dshm\") pod \"gw-section-name-router-with-gat-2f0a622e-kserve-b779bbb84-nl954\" (UID: \"5308f447-8a0e-4789-a395-47212f6a12f4\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-b779bbb84-nl954"
Apr 24 21:27:15.778676 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:27:15.778606 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5308f447-8a0e-4789-a395-47212f6a12f4-model-cache\") pod \"gw-section-name-router-with-gat-2f0a622e-kserve-b779bbb84-nl954\" (UID: \"5308f447-8a0e-4789-a395-47212f6a12f4\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-b779bbb84-nl954"
Apr 24 21:27:15.778676 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:27:15.778649 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5308f447-8a0e-4789-a395-47212f6a12f4-home\") pod \"gw-section-name-router-with-gat-2f0a622e-kserve-b779bbb84-nl954\" (UID: \"5308f447-8a0e-4789-a395-47212f6a12f4\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-b779bbb84-nl954"
Apr 24 21:27:15.778826 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:27:15.778701 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5308f447-8a0e-4789-a395-47212f6a12f4-tls-certs\") pod \"gw-section-name-router-with-gat-2f0a622e-kserve-b779bbb84-nl954\" (UID: \"5308f447-8a0e-4789-a395-47212f6a12f4\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-b779bbb84-nl954"
Apr 24 21:27:15.778826 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:27:15.778715 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5308f447-8a0e-4789-a395-47212f6a12f4-tmp-dir\") pod \"gw-section-name-router-with-gat-2f0a622e-kserve-b779bbb84-nl954\" (UID: \"5308f447-8a0e-4789-a395-47212f6a12f4\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-b779bbb84-nl954"
Apr 24 21:27:15.778826 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:27:15.778755 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5308f447-8a0e-4789-a395-47212f6a12f4-kserve-provision-location\") pod \"gw-section-name-router-with-gat-2f0a622e-kserve-b779bbb84-nl954\" (UID: \"5308f447-8a0e-4789-a395-47212f6a12f4\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-b779bbb84-nl954"
Apr 24 21:27:15.779020 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:27:15.778989 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5308f447-8a0e-4789-a395-47212f6a12f4-model-cache\") pod \"gw-section-name-router-with-gat-2f0a622e-kserve-b779bbb84-nl954\" (UID: \"5308f447-8a0e-4789-a395-47212f6a12f4\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-b779bbb84-nl954"
Apr 24 21:27:15.779077 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:27:15.779023 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5308f447-8a0e-4789-a395-47212f6a12f4-home\") pod \"gw-section-name-router-with-gat-2f0a622e-kserve-b779bbb84-nl954\" (UID: \"5308f447-8a0e-4789-a395-47212f6a12f4\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-b779bbb84-nl954"
Apr 24 21:27:15.780741 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:27:15.780722 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5308f447-8a0e-4789-a395-47212f6a12f4-dshm\") pod \"gw-section-name-router-with-gat-2f0a622e-kserve-b779bbb84-nl954\" (UID: \"5308f447-8a0e-4789-a395-47212f6a12f4\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-b779bbb84-nl954"
Apr 24 21:27:15.781007 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:27:15.780991 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5308f447-8a0e-4789-a395-47212f6a12f4-tls-certs\") pod \"gw-section-name-router-with-gat-2f0a622e-kserve-b779bbb84-nl954\" (UID: \"5308f447-8a0e-4789-a395-47212f6a12f4\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-b779bbb84-nl954"
Apr 24 21:27:15.790694 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:27:15.790671 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jfrj\" (UniqueName: \"kubernetes.io/projected/5308f447-8a0e-4789-a395-47212f6a12f4-kube-api-access-6jfrj\") pod \"gw-section-name-router-with-gat-2f0a622e-kserve-b779bbb84-nl954\" (UID: \"5308f447-8a0e-4789-a395-47212f6a12f4\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-b779bbb84-nl954"
Apr 24 21:27:15.807564 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:27:15.807537 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-b779bbb84-nl954"
Apr 24 21:27:15.933658 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:27:15.933622 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-b779bbb84-nl954"]
Apr 24 21:27:15.937268 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:27:15.937238 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5308f447_8a0e_4789_a395_47212f6a12f4.slice/crio-659324137d4dcc94d6ec9b98ebb6933f5a0db65ad8b8fc786ba34fb2a4aca5c4 WatchSource:0}: Error finding container 659324137d4dcc94d6ec9b98ebb6933f5a0db65ad8b8fc786ba34fb2a4aca5c4: Status 404 returned error can't find the container with id 659324137d4dcc94d6ec9b98ebb6933f5a0db65ad8b8fc786ba34fb2a4aca5c4
Apr 24 21:27:15.939194 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:27:15.939178 2560 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 21:27:16.480688 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:27:16.480650 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-b779bbb84-nl954" event={"ID":"5308f447-8a0e-4789-a395-47212f6a12f4","Type":"ContainerStarted","Data":"659324137d4dcc94d6ec9b98ebb6933f5a0db65ad8b8fc786ba34fb2a4aca5c4"}
Apr 24 21:27:20.498216 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:27:20.498178 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-b779bbb84-nl954" event={"ID":"5308f447-8a0e-4789-a395-47212f6a12f4","Type":"ContainerStarted","Data":"854a05fed323d93435505932c89e6444ee3b6d96a87a5035d86656bb478c5637"}
Apr 24 21:27:24.513006 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:27:24.512974 2560 generic.go:358] "Generic (PLEG): container finished" podID="5308f447-8a0e-4789-a395-47212f6a12f4" containerID="854a05fed323d93435505932c89e6444ee3b6d96a87a5035d86656bb478c5637" exitCode=0
Apr 24 21:27:24.513425 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:27:24.513052 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-b779bbb84-nl954" event={"ID":"5308f447-8a0e-4789-a395-47212f6a12f4","Type":"ContainerDied","Data":"854a05fed323d93435505932c89e6444ee3b6d96a87a5035d86656bb478c5637"}
Apr 24 21:27:26.521843 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:27:26.521806 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-b779bbb84-nl954" event={"ID":"5308f447-8a0e-4789-a395-47212f6a12f4","Type":"ContainerStarted","Data":"90afc29aa08de15c56c06d9f7f63add610452670931c7d9b12a19a22298f4efc"}
Apr 24 21:27:30.472717 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:27:30.472659 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-b779bbb84-nl954" podStartSLOduration=5.806425177 podStartE2EDuration="15.472641402s" podCreationTimestamp="2026-04-24 21:27:15 +0000 UTC" firstStartedPulling="2026-04-24 21:27:15.939302779 +0000 UTC m=+623.998090428" lastFinishedPulling="2026-04-24 21:27:25.605518988 +0000 UTC m=+633.664306653" observedRunningTime="2026-04-24 21:27:26.544437518 +0000 UTC m=+634.603225191" watchObservedRunningTime="2026-04-24 21:27:30.472641402 +0000 UTC m=+638.531429100"
Apr 24 21:27:30.474875 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:27:30.474846 2560 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-b779bbb84-nl954"]
Apr 24 21:27:30.475185 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:27:30.475156 2560 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-b779bbb84-nl954" podUID="5308f447-8a0e-4789-a395-47212f6a12f4" containerName="main" containerID="cri-o://90afc29aa08de15c56c06d9f7f63add610452670931c7d9b12a19a22298f4efc" gracePeriod=30
Apr 24 21:27:30.730527 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:27:30.730472 2560 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-b779bbb84-nl954"
Apr 24 21:27:30.802853 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:27:30.802822 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5308f447-8a0e-4789-a395-47212f6a12f4-kserve-provision-location\") pod \"5308f447-8a0e-4789-a395-47212f6a12f4\" (UID: \"5308f447-8a0e-4789-a395-47212f6a12f4\") "
Apr 24 21:27:30.803044 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:27:30.802894 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5308f447-8a0e-4789-a395-47212f6a12f4-home\") pod \"5308f447-8a0e-4789-a395-47212f6a12f4\" (UID: \"5308f447-8a0e-4789-a395-47212f6a12f4\") "
Apr 24 21:27:30.803044 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:27:30.802948 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5308f447-8a0e-4789-a395-47212f6a12f4-tmp-dir\") pod \"5308f447-8a0e-4789-a395-47212f6a12f4\" (UID: \"5308f447-8a0e-4789-a395-47212f6a12f4\") "
Apr 24 21:27:30.803044 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:27:30.802994 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5308f447-8a0e-4789-a395-47212f6a12f4-tls-certs\") pod \"5308f447-8a0e-4789-a395-47212f6a12f4\" (UID: \"5308f447-8a0e-4789-a395-47212f6a12f4\") "
Apr 24
21:27:30.803044 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:27:30.803040 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6jfrj\" (UniqueName: \"kubernetes.io/projected/5308f447-8a0e-4789-a395-47212f6a12f4-kube-api-access-6jfrj\") pod \"5308f447-8a0e-4789-a395-47212f6a12f4\" (UID: \"5308f447-8a0e-4789-a395-47212f6a12f4\") " Apr 24 21:27:30.803259 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:27:30.803077 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5308f447-8a0e-4789-a395-47212f6a12f4-dshm\") pod \"5308f447-8a0e-4789-a395-47212f6a12f4\" (UID: \"5308f447-8a0e-4789-a395-47212f6a12f4\") " Apr 24 21:27:30.803259 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:27:30.803105 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5308f447-8a0e-4789-a395-47212f6a12f4-model-cache\") pod \"5308f447-8a0e-4789-a395-47212f6a12f4\" (UID: \"5308f447-8a0e-4789-a395-47212f6a12f4\") " Apr 24 21:27:30.803259 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:27:30.803181 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5308f447-8a0e-4789-a395-47212f6a12f4-home" (OuterVolumeSpecName: "home") pod "5308f447-8a0e-4789-a395-47212f6a12f4" (UID: "5308f447-8a0e-4789-a395-47212f6a12f4"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:27:30.803259 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:27:30.803207 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5308f447-8a0e-4789-a395-47212f6a12f4-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "5308f447-8a0e-4789-a395-47212f6a12f4" (UID: "5308f447-8a0e-4789-a395-47212f6a12f4"). InnerVolumeSpecName "tmp-dir". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:27:30.803449 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:27:30.803368 2560 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5308f447-8a0e-4789-a395-47212f6a12f4-home\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\"" Apr 24 21:27:30.803449 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:27:30.803389 2560 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5308f447-8a0e-4789-a395-47212f6a12f4-tmp-dir\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\"" Apr 24 21:27:30.803449 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:27:30.803403 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5308f447-8a0e-4789-a395-47212f6a12f4-model-cache" (OuterVolumeSpecName: "model-cache") pod "5308f447-8a0e-4789-a395-47212f6a12f4" (UID: "5308f447-8a0e-4789-a395-47212f6a12f4"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:27:30.805360 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:27:30.805331 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5308f447-8a0e-4789-a395-47212f6a12f4-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "5308f447-8a0e-4789-a395-47212f6a12f4" (UID: "5308f447-8a0e-4789-a395-47212f6a12f4"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:27:30.805485 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:27:30.805389 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5308f447-8a0e-4789-a395-47212f6a12f4-kube-api-access-6jfrj" (OuterVolumeSpecName: "kube-api-access-6jfrj") pod "5308f447-8a0e-4789-a395-47212f6a12f4" (UID: "5308f447-8a0e-4789-a395-47212f6a12f4"). InnerVolumeSpecName "kube-api-access-6jfrj". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:27:30.805485 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:27:30.805424 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5308f447-8a0e-4789-a395-47212f6a12f4-dshm" (OuterVolumeSpecName: "dshm") pod "5308f447-8a0e-4789-a395-47212f6a12f4" (UID: "5308f447-8a0e-4789-a395-47212f6a12f4"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:27:30.867580 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:27:30.867539 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5308f447-8a0e-4789-a395-47212f6a12f4-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "5308f447-8a0e-4789-a395-47212f6a12f4" (UID: "5308f447-8a0e-4789-a395-47212f6a12f4"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:27:30.904538 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:27:30.904507 2560 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5308f447-8a0e-4789-a395-47212f6a12f4-kserve-provision-location\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\"" Apr 24 21:27:30.904538 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:27:30.904537 2560 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5308f447-8a0e-4789-a395-47212f6a12f4-tls-certs\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\"" Apr 24 21:27:30.904690 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:27:30.904550 2560 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6jfrj\" (UniqueName: \"kubernetes.io/projected/5308f447-8a0e-4789-a395-47212f6a12f4-kube-api-access-6jfrj\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\"" Apr 24 21:27:30.904690 ip-10-0-128-142 kubenswrapper[2560]: 
I0424 21:27:30.904558 2560 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5308f447-8a0e-4789-a395-47212f6a12f4-dshm\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\"" Apr 24 21:27:30.904690 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:27:30.904568 2560 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5308f447-8a0e-4789-a395-47212f6a12f4-model-cache\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\"" Apr 24 21:27:31.540430 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:27:31.540392 2560 generic.go:358] "Generic (PLEG): container finished" podID="5308f447-8a0e-4789-a395-47212f6a12f4" containerID="90afc29aa08de15c56c06d9f7f63add610452670931c7d9b12a19a22298f4efc" exitCode=0 Apr 24 21:27:31.540430 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:27:31.540431 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-b779bbb84-nl954" event={"ID":"5308f447-8a0e-4789-a395-47212f6a12f4","Type":"ContainerDied","Data":"90afc29aa08de15c56c06d9f7f63add610452670931c7d9b12a19a22298f4efc"} Apr 24 21:27:31.540874 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:27:31.540456 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-b779bbb84-nl954" event={"ID":"5308f447-8a0e-4789-a395-47212f6a12f4","Type":"ContainerDied","Data":"659324137d4dcc94d6ec9b98ebb6933f5a0db65ad8b8fc786ba34fb2a4aca5c4"} Apr 24 21:27:31.540874 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:27:31.540470 2560 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-b779bbb84-nl954" Apr 24 21:27:31.540874 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:27:31.540477 2560 scope.go:117] "RemoveContainer" containerID="90afc29aa08de15c56c06d9f7f63add610452670931c7d9b12a19a22298f4efc" Apr 24 21:27:31.549130 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:27:31.549114 2560 scope.go:117] "RemoveContainer" containerID="854a05fed323d93435505932c89e6444ee3b6d96a87a5035d86656bb478c5637" Apr 24 21:27:31.563080 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:27:31.563054 2560 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-b779bbb84-nl954"] Apr 24 21:27:31.567804 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:27:31.567782 2560 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-b779bbb84-nl954"] Apr 24 21:27:32.327177 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:27:32.327149 2560 scope.go:117] "RemoveContainer" containerID="90afc29aa08de15c56c06d9f7f63add610452670931c7d9b12a19a22298f4efc" Apr 24 21:27:32.327536 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:27:32.327516 2560 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90afc29aa08de15c56c06d9f7f63add610452670931c7d9b12a19a22298f4efc\": container with ID starting with 90afc29aa08de15c56c06d9f7f63add610452670931c7d9b12a19a22298f4efc not found: ID does not exist" containerID="90afc29aa08de15c56c06d9f7f63add610452670931c7d9b12a19a22298f4efc" Apr 24 21:27:32.327613 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:27:32.327547 2560 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90afc29aa08de15c56c06d9f7f63add610452670931c7d9b12a19a22298f4efc"} err="failed to get container status 
\"90afc29aa08de15c56c06d9f7f63add610452670931c7d9b12a19a22298f4efc\": rpc error: code = NotFound desc = could not find container \"90afc29aa08de15c56c06d9f7f63add610452670931c7d9b12a19a22298f4efc\": container with ID starting with 90afc29aa08de15c56c06d9f7f63add610452670931c7d9b12a19a22298f4efc not found: ID does not exist" Apr 24 21:27:32.327613 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:27:32.327568 2560 scope.go:117] "RemoveContainer" containerID="854a05fed323d93435505932c89e6444ee3b6d96a87a5035d86656bb478c5637" Apr 24 21:27:32.327865 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:27:32.327839 2560 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"854a05fed323d93435505932c89e6444ee3b6d96a87a5035d86656bb478c5637\": container with ID starting with 854a05fed323d93435505932c89e6444ee3b6d96a87a5035d86656bb478c5637 not found: ID does not exist" containerID="854a05fed323d93435505932c89e6444ee3b6d96a87a5035d86656bb478c5637" Apr 24 21:27:32.327973 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:27:32.327873 2560 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"854a05fed323d93435505932c89e6444ee3b6d96a87a5035d86656bb478c5637"} err="failed to get container status \"854a05fed323d93435505932c89e6444ee3b6d96a87a5035d86656bb478c5637\": rpc error: code = NotFound desc = could not find container \"854a05fed323d93435505932c89e6444ee3b6d96a87a5035d86656bb478c5637\": container with ID starting with 854a05fed323d93435505932c89e6444ee3b6d96a87a5035d86656bb478c5637 not found: ID does not exist" Apr 24 21:27:32.448801 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:27:32.448769 2560 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5308f447-8a0e-4789-a395-47212f6a12f4" path="/var/lib/kubelet/pods/5308f447-8a0e-4789-a395-47212f6a12f4/volumes" Apr 24 21:27:55.096934 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:27:55.096889 2560 kubelet.go:2537] 
"SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-7cc68c964cmskss"] Apr 24 21:27:55.097313 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:27:55.097275 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5308f447-8a0e-4789-a395-47212f6a12f4" containerName="main" Apr 24 21:27:55.097313 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:27:55.097290 2560 state_mem.go:107] "Deleted CPUSet assignment" podUID="5308f447-8a0e-4789-a395-47212f6a12f4" containerName="main" Apr 24 21:27:55.097383 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:27:55.097318 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5308f447-8a0e-4789-a395-47212f6a12f4" containerName="storage-initializer" Apr 24 21:27:55.097383 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:27:55.097326 2560 state_mem.go:107] "Deleted CPUSet assignment" podUID="5308f447-8a0e-4789-a395-47212f6a12f4" containerName="storage-initializer" Apr 24 21:27:55.097451 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:27:55.097403 2560 memory_manager.go:356] "RemoveStaleState removing state" podUID="5308f447-8a0e-4789-a395-47212f6a12f4" containerName="main" Apr 24 21:27:55.101765 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:27:55.101746 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-7cc68c964cmskss" Apr 24 21:27:55.104856 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:27:55.104836 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-qdwck\"" Apr 24 21:27:55.104976 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:27:55.104839 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"gw-sec2774c263d49959f50d9eebc552e13bf9-kserve-self-signed-certs\"" Apr 24 21:27:55.110048 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:27:55.110023 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-7cc68c964cmskss"] Apr 24 21:27:55.171552 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:27:55.171529 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lxd2\" (UniqueName: \"kubernetes.io/projected/0dee2e9f-82b4-4084-81a1-c441cea2cbda-kube-api-access-4lxd2\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-7cc68c964cmskss\" (UID: \"0dee2e9f-82b4-4084-81a1-c441cea2cbda\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-7cc68c964cmskss" Apr 24 21:27:55.171666 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:27:55.171558 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/0dee2e9f-82b4-4084-81a1-c441cea2cbda-home\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-7cc68c964cmskss\" (UID: \"0dee2e9f-82b4-4084-81a1-c441cea2cbda\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-7cc68c964cmskss" Apr 24 21:27:55.171666 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:27:55.171591 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"dshm\" (UniqueName: \"kubernetes.io/empty-dir/0dee2e9f-82b4-4084-81a1-c441cea2cbda-dshm\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-7cc68c964cmskss\" (UID: \"0dee2e9f-82b4-4084-81a1-c441cea2cbda\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-7cc68c964cmskss" Apr 24 21:27:55.171666 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:27:55.171638 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/0dee2e9f-82b4-4084-81a1-c441cea2cbda-model-cache\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-7cc68c964cmskss\" (UID: \"0dee2e9f-82b4-4084-81a1-c441cea2cbda\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-7cc68c964cmskss" Apr 24 21:27:55.171790 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:27:55.171727 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0dee2e9f-82b4-4084-81a1-c441cea2cbda-tls-certs\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-7cc68c964cmskss\" (UID: \"0dee2e9f-82b4-4084-81a1-c441cea2cbda\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-7cc68c964cmskss" Apr 24 21:27:55.171790 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:27:55.171756 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0dee2e9f-82b4-4084-81a1-c441cea2cbda-kserve-provision-location\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-7cc68c964cmskss\" (UID: \"0dee2e9f-82b4-4084-81a1-c441cea2cbda\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-7cc68c964cmskss" Apr 24 21:27:55.171881 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:27:55.171791 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0dee2e9f-82b4-4084-81a1-c441cea2cbda-tmp-dir\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-7cc68c964cmskss\" (UID: \"0dee2e9f-82b4-4084-81a1-c441cea2cbda\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-7cc68c964cmskss" Apr 24 21:27:55.272391 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:27:55.272350 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/0dee2e9f-82b4-4084-81a1-c441cea2cbda-dshm\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-7cc68c964cmskss\" (UID: \"0dee2e9f-82b4-4084-81a1-c441cea2cbda\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-7cc68c964cmskss" Apr 24 21:27:55.272391 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:27:55.272394 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/0dee2e9f-82b4-4084-81a1-c441cea2cbda-model-cache\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-7cc68c964cmskss\" (UID: \"0dee2e9f-82b4-4084-81a1-c441cea2cbda\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-7cc68c964cmskss" Apr 24 21:27:55.272586 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:27:55.272439 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0dee2e9f-82b4-4084-81a1-c441cea2cbda-tls-certs\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-7cc68c964cmskss\" (UID: \"0dee2e9f-82b4-4084-81a1-c441cea2cbda\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-7cc68c964cmskss" Apr 24 21:27:55.272586 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:27:55.272464 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/0dee2e9f-82b4-4084-81a1-c441cea2cbda-kserve-provision-location\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-7cc68c964cmskss\" (UID: \"0dee2e9f-82b4-4084-81a1-c441cea2cbda\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-7cc68c964cmskss" Apr 24 21:27:55.272586 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:27:55.272495 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0dee2e9f-82b4-4084-81a1-c441cea2cbda-tmp-dir\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-7cc68c964cmskss\" (UID: \"0dee2e9f-82b4-4084-81a1-c441cea2cbda\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-7cc68c964cmskss" Apr 24 21:27:55.272586 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:27:55.272515 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4lxd2\" (UniqueName: \"kubernetes.io/projected/0dee2e9f-82b4-4084-81a1-c441cea2cbda-kube-api-access-4lxd2\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-7cc68c964cmskss\" (UID: \"0dee2e9f-82b4-4084-81a1-c441cea2cbda\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-7cc68c964cmskss" Apr 24 21:27:55.272586 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:27:55.272534 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/0dee2e9f-82b4-4084-81a1-c441cea2cbda-home\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-7cc68c964cmskss\" (UID: \"0dee2e9f-82b4-4084-81a1-c441cea2cbda\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-7cc68c964cmskss" Apr 24 21:27:55.272859 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:27:55.272836 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/0dee2e9f-82b4-4084-81a1-c441cea2cbda-model-cache\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-7cc68c964cmskss\" (UID: \"0dee2e9f-82b4-4084-81a1-c441cea2cbda\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-7cc68c964cmskss" Apr 24 21:27:55.272950 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:27:55.272890 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0dee2e9f-82b4-4084-81a1-c441cea2cbda-kserve-provision-location\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-7cc68c964cmskss\" (UID: \"0dee2e9f-82b4-4084-81a1-c441cea2cbda\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-7cc68c964cmskss" Apr 24 21:27:55.273003 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:27:55.272946 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0dee2e9f-82b4-4084-81a1-c441cea2cbda-tmp-dir\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-7cc68c964cmskss\" (UID: \"0dee2e9f-82b4-4084-81a1-c441cea2cbda\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-7cc68c964cmskss" Apr 24 21:27:55.273003 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:27:55.272996 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/0dee2e9f-82b4-4084-81a1-c441cea2cbda-home\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-7cc68c964cmskss\" (UID: \"0dee2e9f-82b4-4084-81a1-c441cea2cbda\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-7cc68c964cmskss" Apr 24 21:27:55.274947 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:27:55.274916 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/0dee2e9f-82b4-4084-81a1-c441cea2cbda-dshm\") pod 
\"gw-section-name-router-with-gat-f1d92d0f-kserve-7cc68c964cmskss\" (UID: \"0dee2e9f-82b4-4084-81a1-c441cea2cbda\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-7cc68c964cmskss" Apr 24 21:27:55.275183 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:27:55.275167 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0dee2e9f-82b4-4084-81a1-c441cea2cbda-tls-certs\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-7cc68c964cmskss\" (UID: \"0dee2e9f-82b4-4084-81a1-c441cea2cbda\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-7cc68c964cmskss" Apr 24 21:27:55.280984 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:27:55.280963 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lxd2\" (UniqueName: \"kubernetes.io/projected/0dee2e9f-82b4-4084-81a1-c441cea2cbda-kube-api-access-4lxd2\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-7cc68c964cmskss\" (UID: \"0dee2e9f-82b4-4084-81a1-c441cea2cbda\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-7cc68c964cmskss" Apr 24 21:27:55.411622 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:27:55.411563 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-7cc68c964cmskss" Apr 24 21:27:55.536744 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:27:55.536719 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-7cc68c964cmskss"] Apr 24 21:27:55.538300 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:27:55.538272 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0dee2e9f_82b4_4084_81a1_c441cea2cbda.slice/crio-838d785ee8398d26ea71bc9db7a92791dd52b87242c8f44812270a02adf0d823 WatchSource:0}: Error finding container 838d785ee8398d26ea71bc9db7a92791dd52b87242c8f44812270a02adf0d823: Status 404 returned error can't find the container with id 838d785ee8398d26ea71bc9db7a92791dd52b87242c8f44812270a02adf0d823 Apr 24 21:27:55.620404 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:27:55.620366 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-7cc68c964cmskss" event={"ID":"0dee2e9f-82b4-4084-81a1-c441cea2cbda","Type":"ContainerStarted","Data":"0408f49ae8c4370be685ff28e9a83e818c7d3f982d9af457ef7d20ec57dcc545"} Apr 24 21:27:55.620404 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:27:55.620406 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-7cc68c964cmskss" event={"ID":"0dee2e9f-82b4-4084-81a1-c441cea2cbda","Type":"ContainerStarted","Data":"838d785ee8398d26ea71bc9db7a92791dd52b87242c8f44812270a02adf0d823"} Apr 24 21:28:01.790204 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:28:01.790172 2560 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-7cc68c964cmskss"] Apr 24 21:28:01.790618 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:28:01.790405 2560 kuberuntime_container.go:864] 
"Killing container with a grace period" pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-7cc68c964cmskss" podUID="0dee2e9f-82b4-4084-81a1-c441cea2cbda" containerName="storage-initializer" containerID="cri-o://0408f49ae8c4370be685ff28e9a83e818c7d3f982d9af457ef7d20ec57dcc545" gracePeriod=30
Apr 24 21:28:04.037739 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:28:04.037717 2560 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-7cc68c964cmskss"
Apr 24 21:28:04.138976 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:28:04.138899 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0dee2e9f-82b4-4084-81a1-c441cea2cbda-kserve-provision-location\") pod \"0dee2e9f-82b4-4084-81a1-c441cea2cbda\" (UID: \"0dee2e9f-82b4-4084-81a1-c441cea2cbda\") "
Apr 24 21:28:04.138976 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:28:04.138953 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/0dee2e9f-82b4-4084-81a1-c441cea2cbda-home\") pod \"0dee2e9f-82b4-4084-81a1-c441cea2cbda\" (UID: \"0dee2e9f-82b4-4084-81a1-c441cea2cbda\") "
Apr 24 21:28:04.139121 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:28:04.138986 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/0dee2e9f-82b4-4084-81a1-c441cea2cbda-dshm\") pod \"0dee2e9f-82b4-4084-81a1-c441cea2cbda\" (UID: \"0dee2e9f-82b4-4084-81a1-c441cea2cbda\") "
Apr 24 21:28:04.139121 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:28:04.139033 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0dee2e9f-82b4-4084-81a1-c441cea2cbda-tls-certs\") pod \"0dee2e9f-82b4-4084-81a1-c441cea2cbda\" (UID: \"0dee2e9f-82b4-4084-81a1-c441cea2cbda\") "
Apr 24 21:28:04.139121 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:28:04.139055 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0dee2e9f-82b4-4084-81a1-c441cea2cbda-tmp-dir\") pod \"0dee2e9f-82b4-4084-81a1-c441cea2cbda\" (UID: \"0dee2e9f-82b4-4084-81a1-c441cea2cbda\") "
Apr 24 21:28:04.139259 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:28:04.139122 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4lxd2\" (UniqueName: \"kubernetes.io/projected/0dee2e9f-82b4-4084-81a1-c441cea2cbda-kube-api-access-4lxd2\") pod \"0dee2e9f-82b4-4084-81a1-c441cea2cbda\" (UID: \"0dee2e9f-82b4-4084-81a1-c441cea2cbda\") "
Apr 24 21:28:04.139259 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:28:04.139162 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/0dee2e9f-82b4-4084-81a1-c441cea2cbda-model-cache\") pod \"0dee2e9f-82b4-4084-81a1-c441cea2cbda\" (UID: \"0dee2e9f-82b4-4084-81a1-c441cea2cbda\") "
Apr 24 21:28:04.139365 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:28:04.139250 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0dee2e9f-82b4-4084-81a1-c441cea2cbda-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "0dee2e9f-82b4-4084-81a1-c441cea2cbda" (UID: "0dee2e9f-82b4-4084-81a1-c441cea2cbda"). InnerVolumeSpecName "tmp-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:28:04.139365 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:28:04.139261 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0dee2e9f-82b4-4084-81a1-c441cea2cbda-home" (OuterVolumeSpecName: "home") pod "0dee2e9f-82b4-4084-81a1-c441cea2cbda" (UID: "0dee2e9f-82b4-4084-81a1-c441cea2cbda"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:28:04.139474 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:28:04.139385 2560 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0dee2e9f-82b4-4084-81a1-c441cea2cbda-tmp-dir\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\""
Apr 24 21:28:04.139518 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:28:04.139473 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0dee2e9f-82b4-4084-81a1-c441cea2cbda-model-cache" (OuterVolumeSpecName: "model-cache") pod "0dee2e9f-82b4-4084-81a1-c441cea2cbda" (UID: "0dee2e9f-82b4-4084-81a1-c441cea2cbda"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:28:04.141158 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:28:04.141125 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0dee2e9f-82b4-4084-81a1-c441cea2cbda-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "0dee2e9f-82b4-4084-81a1-c441cea2cbda" (UID: "0dee2e9f-82b4-4084-81a1-c441cea2cbda"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:28:04.141278 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:28:04.141185 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0dee2e9f-82b4-4084-81a1-c441cea2cbda-dshm" (OuterVolumeSpecName: "dshm") pod "0dee2e9f-82b4-4084-81a1-c441cea2cbda" (UID: "0dee2e9f-82b4-4084-81a1-c441cea2cbda"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:28:04.141278 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:28:04.141209 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0dee2e9f-82b4-4084-81a1-c441cea2cbda-kube-api-access-4lxd2" (OuterVolumeSpecName: "kube-api-access-4lxd2") pod "0dee2e9f-82b4-4084-81a1-c441cea2cbda" (UID: "0dee2e9f-82b4-4084-81a1-c441cea2cbda"). InnerVolumeSpecName "kube-api-access-4lxd2". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:28:04.204120 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:28:04.204085 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0dee2e9f-82b4-4084-81a1-c441cea2cbda-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "0dee2e9f-82b4-4084-81a1-c441cea2cbda" (UID: "0dee2e9f-82b4-4084-81a1-c441cea2cbda"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:28:04.240420 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:28:04.240400 2560 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0dee2e9f-82b4-4084-81a1-c441cea2cbda-tls-certs\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\""
Apr 24 21:28:04.240420 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:28:04.240421 2560 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4lxd2\" (UniqueName: \"kubernetes.io/projected/0dee2e9f-82b4-4084-81a1-c441cea2cbda-kube-api-access-4lxd2\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\""
Apr 24 21:28:04.240553 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:28:04.240432 2560 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/0dee2e9f-82b4-4084-81a1-c441cea2cbda-model-cache\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\""
Apr 24 21:28:04.240553 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:28:04.240441 2560 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0dee2e9f-82b4-4084-81a1-c441cea2cbda-kserve-provision-location\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\""
Apr 24 21:28:04.240553 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:28:04.240450 2560 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/0dee2e9f-82b4-4084-81a1-c441cea2cbda-home\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\""
Apr 24 21:28:04.240553 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:28:04.240458 2560 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/0dee2e9f-82b4-4084-81a1-c441cea2cbda-dshm\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\""
Apr 24 21:28:04.657429 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:28:04.657396 2560 generic.go:358] "Generic (PLEG): container finished" podID="0dee2e9f-82b4-4084-81a1-c441cea2cbda" containerID="0408f49ae8c4370be685ff28e9a83e818c7d3f982d9af457ef7d20ec57dcc545" exitCode=0
Apr 24 21:28:04.657551 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:28:04.657466 2560 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-7cc68c964cmskss"
Apr 24 21:28:04.657551 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:28:04.657479 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-7cc68c964cmskss" event={"ID":"0dee2e9f-82b4-4084-81a1-c441cea2cbda","Type":"ContainerDied","Data":"0408f49ae8c4370be685ff28e9a83e818c7d3f982d9af457ef7d20ec57dcc545"}
Apr 24 21:28:04.657551 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:28:04.657525 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-7cc68c964cmskss" event={"ID":"0dee2e9f-82b4-4084-81a1-c441cea2cbda","Type":"ContainerDied","Data":"838d785ee8398d26ea71bc9db7a92791dd52b87242c8f44812270a02adf0d823"}
Apr 24 21:28:04.657551 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:28:04.657544 2560 scope.go:117] "RemoveContainer" containerID="0408f49ae8c4370be685ff28e9a83e818c7d3f982d9af457ef7d20ec57dcc545"
Apr 24 21:28:04.706015 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:28:04.705953 2560 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-7cc68c964cmskss"]
Apr 24 21:28:04.710055 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:28:04.710025 2560 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-7cc68c964cmskss"]
Apr 24 21:28:04.732303 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:28:04.732285 2560 scope.go:117] "RemoveContainer" containerID="0408f49ae8c4370be685ff28e9a83e818c7d3f982d9af457ef7d20ec57dcc545"
Apr 24 21:28:04.732629 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:28:04.732609 2560 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0408f49ae8c4370be685ff28e9a83e818c7d3f982d9af457ef7d20ec57dcc545\": container with ID starting with 0408f49ae8c4370be685ff28e9a83e818c7d3f982d9af457ef7d20ec57dcc545 not found: ID does not exist" containerID="0408f49ae8c4370be685ff28e9a83e818c7d3f982d9af457ef7d20ec57dcc545"
Apr 24 21:28:04.732687 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:28:04.732646 2560 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0408f49ae8c4370be685ff28e9a83e818c7d3f982d9af457ef7d20ec57dcc545"} err="failed to get container status \"0408f49ae8c4370be685ff28e9a83e818c7d3f982d9af457ef7d20ec57dcc545\": rpc error: code = NotFound desc = could not find container \"0408f49ae8c4370be685ff28e9a83e818c7d3f982d9af457ef7d20ec57dcc545\": container with ID starting with 0408f49ae8c4370be685ff28e9a83e818c7d3f982d9af457ef7d20ec57dcc545 not found: ID does not exist"
Apr 24 21:28:06.445335 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:28:06.445299 2560 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0dee2e9f-82b4-4084-81a1-c441cea2cbda" path="/var/lib/kubelet/pods/0dee2e9f-82b4-4084-81a1-c441cea2cbda/volumes"
Apr 24 21:28:12.863311 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:28:12.863266 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-75cd74dbc9gcnnf"]
Apr 24 21:28:12.863786 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:28:12.863665 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0dee2e9f-82b4-4084-81a1-c441cea2cbda" containerName="storage-initializer"
Apr 24 21:28:12.863786 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:28:12.863683 2560 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dee2e9f-82b4-4084-81a1-c441cea2cbda" containerName="storage-initializer"
Apr 24 21:28:12.863786 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:28:12.863771 2560 memory_manager.go:356] "RemoveStaleState removing state" podUID="0dee2e9f-82b4-4084-81a1-c441cea2cbda" containerName="storage-initializer"
Apr 24 21:28:12.869082 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:28:12.869057 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-75cd74dbc9gcnnf"
Apr 24 21:28:12.872527 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:28:12.872498 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvad71fa5348b85aebd404221bba611457-kserve-self-signed-certs\""
Apr 24 21:28:12.872842 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:28:12.872820 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-qdwck\""
Apr 24 21:28:12.880494 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:28:12.880469 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-75cd74dbc9gcnnf"]
Apr 24 21:28:13.005337 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:28:13.005302 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/85bb9951-4b95-4ea6-aeeb-0fdcbed1d016-tmp-dir\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-75cd74dbc9gcnnf\" (UID: \"85bb9951-4b95-4ea6-aeeb-0fdcbed1d016\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-75cd74dbc9gcnnf"
Apr 24 21:28:13.005337 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:28:13.005341 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/85bb9951-4b95-4ea6-aeeb-0fdcbed1d016-home\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-75cd74dbc9gcnnf\" (UID: \"85bb9951-4b95-4ea6-aeeb-0fdcbed1d016\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-75cd74dbc9gcnnf"
Apr 24 21:28:13.005535 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:28:13.005378 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/85bb9951-4b95-4ea6-aeeb-0fdcbed1d016-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-75cd74dbc9gcnnf\" (UID: \"85bb9951-4b95-4ea6-aeeb-0fdcbed1d016\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-75cd74dbc9gcnnf"
Apr 24 21:28:13.005535 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:28:13.005410 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqnqp\" (UniqueName: \"kubernetes.io/projected/85bb9951-4b95-4ea6-aeeb-0fdcbed1d016-kube-api-access-qqnqp\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-75cd74dbc9gcnnf\" (UID: \"85bb9951-4b95-4ea6-aeeb-0fdcbed1d016\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-75cd74dbc9gcnnf"
Apr 24 21:28:13.005535 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:28:13.005430 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/85bb9951-4b95-4ea6-aeeb-0fdcbed1d016-dshm\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-75cd74dbc9gcnnf\" (UID: \"85bb9951-4b95-4ea6-aeeb-0fdcbed1d016\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-75cd74dbc9gcnnf"
Apr 24 21:28:13.005535 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:28:13.005487 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/85bb9951-4b95-4ea6-aeeb-0fdcbed1d016-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-75cd74dbc9gcnnf\" (UID: \"85bb9951-4b95-4ea6-aeeb-0fdcbed1d016\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-75cd74dbc9gcnnf"
Apr 24 21:28:13.005535 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:28:13.005531 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/85bb9951-4b95-4ea6-aeeb-0fdcbed1d016-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-75cd74dbc9gcnnf\" (UID: \"85bb9951-4b95-4ea6-aeeb-0fdcbed1d016\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-75cd74dbc9gcnnf"
Apr 24 21:28:13.105901 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:28:13.105868 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/85bb9951-4b95-4ea6-aeeb-0fdcbed1d016-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-75cd74dbc9gcnnf\" (UID: \"85bb9951-4b95-4ea6-aeeb-0fdcbed1d016\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-75cd74dbc9gcnnf"
Apr 24 21:28:13.106077 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:28:13.105907 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/85bb9951-4b95-4ea6-aeeb-0fdcbed1d016-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-75cd74dbc9gcnnf\" (UID: \"85bb9951-4b95-4ea6-aeeb-0fdcbed1d016\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-75cd74dbc9gcnnf"
Apr 24 21:28:13.106138 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:28:13.106111 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/85bb9951-4b95-4ea6-aeeb-0fdcbed1d016-tmp-dir\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-75cd74dbc9gcnnf\" (UID: \"85bb9951-4b95-4ea6-aeeb-0fdcbed1d016\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-75cd74dbc9gcnnf"
Apr 24 21:28:13.106205 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:28:13.106188 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/85bb9951-4b95-4ea6-aeeb-0fdcbed1d016-home\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-75cd74dbc9gcnnf\" (UID: \"85bb9951-4b95-4ea6-aeeb-0fdcbed1d016\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-75cd74dbc9gcnnf"
Apr 24 21:28:13.106255 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:28:13.106224 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/85bb9951-4b95-4ea6-aeeb-0fdcbed1d016-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-75cd74dbc9gcnnf\" (UID: \"85bb9951-4b95-4ea6-aeeb-0fdcbed1d016\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-75cd74dbc9gcnnf"
Apr 24 21:28:13.106293 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:28:13.106253 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/85bb9951-4b95-4ea6-aeeb-0fdcbed1d016-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-75cd74dbc9gcnnf\" (UID: \"85bb9951-4b95-4ea6-aeeb-0fdcbed1d016\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-75cd74dbc9gcnnf"
Apr 24 21:28:13.106343 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:28:13.106294 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qqnqp\" (UniqueName: \"kubernetes.io/projected/85bb9951-4b95-4ea6-aeeb-0fdcbed1d016-kube-api-access-qqnqp\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-75cd74dbc9gcnnf\" (UID: \"85bb9951-4b95-4ea6-aeeb-0fdcbed1d016\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-75cd74dbc9gcnnf"
Apr 24 21:28:13.106343 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:28:13.106303 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/85bb9951-4b95-4ea6-aeeb-0fdcbed1d016-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-75cd74dbc9gcnnf\" (UID: \"85bb9951-4b95-4ea6-aeeb-0fdcbed1d016\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-75cd74dbc9gcnnf"
Apr 24 21:28:13.106343 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:28:13.106325 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/85bb9951-4b95-4ea6-aeeb-0fdcbed1d016-dshm\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-75cd74dbc9gcnnf\" (UID: \"85bb9951-4b95-4ea6-aeeb-0fdcbed1d016\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-75cd74dbc9gcnnf"
Apr 24 21:28:13.106509 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:28:13.106443 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/85bb9951-4b95-4ea6-aeeb-0fdcbed1d016-home\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-75cd74dbc9gcnnf\" (UID: \"85bb9951-4b95-4ea6-aeeb-0fdcbed1d016\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-75cd74dbc9gcnnf"
Apr 24 21:28:13.106509 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:28:13.106456 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/85bb9951-4b95-4ea6-aeeb-0fdcbed1d016-tmp-dir\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-75cd74dbc9gcnnf\" (UID: \"85bb9951-4b95-4ea6-aeeb-0fdcbed1d016\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-75cd74dbc9gcnnf"
Apr 24 21:28:13.108472 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:28:13.108448 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/85bb9951-4b95-4ea6-aeeb-0fdcbed1d016-dshm\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-75cd74dbc9gcnnf\" (UID: \"85bb9951-4b95-4ea6-aeeb-0fdcbed1d016\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-75cd74dbc9gcnnf"
Apr 24 21:28:13.108751 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:28:13.108731 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/85bb9951-4b95-4ea6-aeeb-0fdcbed1d016-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-75cd74dbc9gcnnf\" (UID: \"85bb9951-4b95-4ea6-aeeb-0fdcbed1d016\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-75cd74dbc9gcnnf"
Apr 24 21:28:13.114738 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:28:13.114684 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqnqp\" (UniqueName: \"kubernetes.io/projected/85bb9951-4b95-4ea6-aeeb-0fdcbed1d016-kube-api-access-qqnqp\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-75cd74dbc9gcnnf\" (UID: \"85bb9951-4b95-4ea6-aeeb-0fdcbed1d016\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-75cd74dbc9gcnnf"
Apr 24 21:28:13.180684 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:28:13.180660 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-75cd74dbc9gcnnf"
Apr 24 21:28:13.310577 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:28:13.310396 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-75cd74dbc9gcnnf"]
Apr 24 21:28:13.314753 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:28:13.314725 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod85bb9951_4b95_4ea6_aeeb_0fdcbed1d016.slice/crio-b2474cb9e49a7883a6b5c07b186da2a1ee59f0dc16bb80f18348f762ec3b569b WatchSource:0}: Error finding container b2474cb9e49a7883a6b5c07b186da2a1ee59f0dc16bb80f18348f762ec3b569b: Status 404 returned error can't find the container with id b2474cb9e49a7883a6b5c07b186da2a1ee59f0dc16bb80f18348f762ec3b569b
Apr 24 21:28:13.691906 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:28:13.691829 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-75cd74dbc9gcnnf" event={"ID":"85bb9951-4b95-4ea6-aeeb-0fdcbed1d016","Type":"ContainerStarted","Data":"89db57d4f1b4ac00d35a159da5c46beb202cbc6977d96ea0d31528ff3a154718"}
Apr 24 21:28:13.691906 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:28:13.691865 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-75cd74dbc9gcnnf" event={"ID":"85bb9951-4b95-4ea6-aeeb-0fdcbed1d016","Type":"ContainerStarted","Data":"b2474cb9e49a7883a6b5c07b186da2a1ee59f0dc16bb80f18348f762ec3b569b"}
Apr 24 21:29:07.878672 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:29:07.878630 2560 generic.go:358] "Generic (PLEG): container finished" podID="85bb9951-4b95-4ea6-aeeb-0fdcbed1d016" containerID="89db57d4f1b4ac00d35a159da5c46beb202cbc6977d96ea0d31528ff3a154718" exitCode=0
Apr 24 21:29:07.879115 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:29:07.878705 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-75cd74dbc9gcnnf" event={"ID":"85bb9951-4b95-4ea6-aeeb-0fdcbed1d016","Type":"ContainerDied","Data":"89db57d4f1b4ac00d35a159da5c46beb202cbc6977d96ea0d31528ff3a154718"}
Apr 24 21:29:08.883993 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:29:08.883958 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-75cd74dbc9gcnnf" event={"ID":"85bb9951-4b95-4ea6-aeeb-0fdcbed1d016","Type":"ContainerStarted","Data":"8da39d62f15e0c1a979449640ca98d9f01be76e778d3ad8bb0f766d47c17093d"}
Apr 24 21:29:08.905753 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:29:08.905684 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-75cd74dbc9gcnnf" podStartSLOduration=56.905664527 podStartE2EDuration="56.905664527s" podCreationTimestamp="2026-04-24 21:28:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:29:08.901140948 +0000 UTC m=+736.959928655" watchObservedRunningTime="2026-04-24 21:29:08.905664527 +0000 UTC m=+736.964452206"
Apr 24 21:29:13.181177 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:29:13.181135 2560 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-75cd74dbc9gcnnf"
Apr 24 21:29:13.181177 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:29:13.181176 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-75cd74dbc9gcnnf"
Apr 24 21:29:13.193489 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:29:13.193465 2560 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-75cd74dbc9gcnnf"
Apr 24 21:29:13.911462 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:29:13.911429 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-75cd74dbc9gcnnf"
Apr 24 21:29:23.862980 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:29:23.862944 2560 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-75cd74dbc9gcnnf"]
Apr 24 21:29:23.863396 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:29:23.863265 2560 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-75cd74dbc9gcnnf" podUID="85bb9951-4b95-4ea6-aeeb-0fdcbed1d016" containerName="main" containerID="cri-o://8da39d62f15e0c1a979449640ca98d9f01be76e778d3ad8bb0f766d47c17093d" gracePeriod=30
Apr 24 21:29:23.900702 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:29:23.900650 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-75cd74dbc9gcnnf" podUID="85bb9951-4b95-4ea6-aeeb-0fdcbed1d016" containerName="main" probeResult="failure" output="Get \"https://10.134.0.36:8000/health\": dial tcp 10.134.0.36:8000: connect: connection refused"
Apr 24 21:29:24.106615 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:29:24.106588 2560 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-75cd74dbc9gcnnf"
Apr 24 21:29:24.158797 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:29:24.158718 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qqnqp\" (UniqueName: \"kubernetes.io/projected/85bb9951-4b95-4ea6-aeeb-0fdcbed1d016-kube-api-access-qqnqp\") pod \"85bb9951-4b95-4ea6-aeeb-0fdcbed1d016\" (UID: \"85bb9951-4b95-4ea6-aeeb-0fdcbed1d016\") "
Apr 24 21:29:24.158978 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:29:24.158806 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/85bb9951-4b95-4ea6-aeeb-0fdcbed1d016-tmp-dir\") pod \"85bb9951-4b95-4ea6-aeeb-0fdcbed1d016\" (UID: \"85bb9951-4b95-4ea6-aeeb-0fdcbed1d016\") "
Apr 24 21:29:24.158978 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:29:24.158830 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/85bb9951-4b95-4ea6-aeeb-0fdcbed1d016-tls-certs\") pod \"85bb9951-4b95-4ea6-aeeb-0fdcbed1d016\" (UID: \"85bb9951-4b95-4ea6-aeeb-0fdcbed1d016\") "
Apr 24 21:29:24.158978 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:29:24.158880 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/85bb9951-4b95-4ea6-aeeb-0fdcbed1d016-dshm\") pod \"85bb9951-4b95-4ea6-aeeb-0fdcbed1d016\" (UID: \"85bb9951-4b95-4ea6-aeeb-0fdcbed1d016\") "
Apr 24 21:29:24.158978 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:29:24.158905 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/85bb9951-4b95-4ea6-aeeb-0fdcbed1d016-kserve-provision-location\") pod \"85bb9951-4b95-4ea6-aeeb-0fdcbed1d016\" (UID: \"85bb9951-4b95-4ea6-aeeb-0fdcbed1d016\") "
Apr 24 21:29:24.158978 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:29:24.158950 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/85bb9951-4b95-4ea6-aeeb-0fdcbed1d016-model-cache\") pod \"85bb9951-4b95-4ea6-aeeb-0fdcbed1d016\" (UID: \"85bb9951-4b95-4ea6-aeeb-0fdcbed1d016\") "
Apr 24 21:29:24.159265 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:29:24.159009 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/85bb9951-4b95-4ea6-aeeb-0fdcbed1d016-home\") pod \"85bb9951-4b95-4ea6-aeeb-0fdcbed1d016\" (UID: \"85bb9951-4b95-4ea6-aeeb-0fdcbed1d016\") "
Apr 24 21:29:24.159265 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:29:24.159136 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85bb9951-4b95-4ea6-aeeb-0fdcbed1d016-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "85bb9951-4b95-4ea6-aeeb-0fdcbed1d016" (UID: "85bb9951-4b95-4ea6-aeeb-0fdcbed1d016"). InnerVolumeSpecName "tmp-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:29:24.159371 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:29:24.159276 2560 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/85bb9951-4b95-4ea6-aeeb-0fdcbed1d016-tmp-dir\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\""
Apr 24 21:29:24.159371 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:29:24.159286 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85bb9951-4b95-4ea6-aeeb-0fdcbed1d016-model-cache" (OuterVolumeSpecName: "model-cache") pod "85bb9951-4b95-4ea6-aeeb-0fdcbed1d016" (UID: "85bb9951-4b95-4ea6-aeeb-0fdcbed1d016"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:29:24.159371 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:29:24.159301 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85bb9951-4b95-4ea6-aeeb-0fdcbed1d016-home" (OuterVolumeSpecName: "home") pod "85bb9951-4b95-4ea6-aeeb-0fdcbed1d016" (UID: "85bb9951-4b95-4ea6-aeeb-0fdcbed1d016"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:29:24.160913 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:29:24.160883 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85bb9951-4b95-4ea6-aeeb-0fdcbed1d016-kube-api-access-qqnqp" (OuterVolumeSpecName: "kube-api-access-qqnqp") pod "85bb9951-4b95-4ea6-aeeb-0fdcbed1d016" (UID: "85bb9951-4b95-4ea6-aeeb-0fdcbed1d016"). InnerVolumeSpecName "kube-api-access-qqnqp". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:29:24.161038 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:29:24.160950 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85bb9951-4b95-4ea6-aeeb-0fdcbed1d016-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "85bb9951-4b95-4ea6-aeeb-0fdcbed1d016" (UID: "85bb9951-4b95-4ea6-aeeb-0fdcbed1d016"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:29:24.161038 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:29:24.160965 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85bb9951-4b95-4ea6-aeeb-0fdcbed1d016-dshm" (OuterVolumeSpecName: "dshm") pod "85bb9951-4b95-4ea6-aeeb-0fdcbed1d016" (UID: "85bb9951-4b95-4ea6-aeeb-0fdcbed1d016"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:29:24.215186 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:29:24.215127 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85bb9951-4b95-4ea6-aeeb-0fdcbed1d016-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "85bb9951-4b95-4ea6-aeeb-0fdcbed1d016" (UID: "85bb9951-4b95-4ea6-aeeb-0fdcbed1d016"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:29:24.260655 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:29:24.260626 2560 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qqnqp\" (UniqueName: \"kubernetes.io/projected/85bb9951-4b95-4ea6-aeeb-0fdcbed1d016-kube-api-access-qqnqp\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\""
Apr 24 21:29:24.260655 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:29:24.260652 2560 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/85bb9951-4b95-4ea6-aeeb-0fdcbed1d016-tls-certs\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\""
Apr 24 21:29:24.260799 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:29:24.260663 2560 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/85bb9951-4b95-4ea6-aeeb-0fdcbed1d016-dshm\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\""
Apr 24 21:29:24.260799 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:29:24.260672 2560 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/85bb9951-4b95-4ea6-aeeb-0fdcbed1d016-kserve-provision-location\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\""
Apr 24 21:29:24.260799 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:29:24.260681 2560 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/85bb9951-4b95-4ea6-aeeb-0fdcbed1d016-model-cache\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\""
Apr 24 21:29:24.260799 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:29:24.260688 2560 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/85bb9951-4b95-4ea6-aeeb-0fdcbed1d016-home\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\""
Apr 24 21:29:24.942270 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:29:24.942237 2560 generic.go:358] "Generic (PLEG): container finished" podID="85bb9951-4b95-4ea6-aeeb-0fdcbed1d016" containerID="8da39d62f15e0c1a979449640ca98d9f01be76e778d3ad8bb0f766d47c17093d" exitCode=0
Apr 24 21:29:24.942754 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:29:24.942308 2560 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-75cd74dbc9gcnnf"
Apr 24 21:29:24.942754 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:29:24.942327 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-75cd74dbc9gcnnf" event={"ID":"85bb9951-4b95-4ea6-aeeb-0fdcbed1d016","Type":"ContainerDied","Data":"8da39d62f15e0c1a979449640ca98d9f01be76e778d3ad8bb0f766d47c17093d"}
Apr 24 21:29:24.942754 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:29:24.942372 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-75cd74dbc9gcnnf" event={"ID":"85bb9951-4b95-4ea6-aeeb-0fdcbed1d016","Type":"ContainerDied","Data":"b2474cb9e49a7883a6b5c07b186da2a1ee59f0dc16bb80f18348f762ec3b569b"}
Apr 24 21:29:24.942754 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:29:24.942399 2560 scope.go:117] "RemoveContainer" containerID="8da39d62f15e0c1a979449640ca98d9f01be76e778d3ad8bb0f766d47c17093d"
Apr 24 21:29:24.950673 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:29:24.950656 2560 scope.go:117] "RemoveContainer" containerID="89db57d4f1b4ac00d35a159da5c46beb202cbc6977d96ea0d31528ff3a154718"
Apr 24 21:29:24.961034 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:29:24.961010 2560 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-75cd74dbc9gcnnf"]
Apr 24 21:29:24.963688 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:29:24.963667 2560 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-75cd74dbc9gcnnf"]
Apr 24 21:29:25.010879 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:29:25.010848 2560 scope.go:117] "RemoveContainer" containerID="8da39d62f15e0c1a979449640ca98d9f01be76e778d3ad8bb0f766d47c17093d"
Apr 24 21:29:25.011228 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:29:25.011205 2560 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8da39d62f15e0c1a979449640ca98d9f01be76e778d3ad8bb0f766d47c17093d\": container with ID starting with 8da39d62f15e0c1a979449640ca98d9f01be76e778d3ad8bb0f766d47c17093d not found: ID does not exist" containerID="8da39d62f15e0c1a979449640ca98d9f01be76e778d3ad8bb0f766d47c17093d"
Apr 24 21:29:25.011315 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:29:25.011240 2560 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8da39d62f15e0c1a979449640ca98d9f01be76e778d3ad8bb0f766d47c17093d"} err="failed to get container status \"8da39d62f15e0c1a979449640ca98d9f01be76e778d3ad8bb0f766d47c17093d\": rpc error: code = NotFound desc = could not find container \"8da39d62f15e0c1a979449640ca98d9f01be76e778d3ad8bb0f766d47c17093d\": container with ID starting with 8da39d62f15e0c1a979449640ca98d9f01be76e778d3ad8bb0f766d47c17093d not found: ID does not exist"
Apr 24 21:29:25.011315 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:29:25.011261 2560 scope.go:117] "RemoveContainer"
containerID="89db57d4f1b4ac00d35a159da5c46beb202cbc6977d96ea0d31528ff3a154718" Apr 24 21:29:25.011518 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:29:25.011497 2560 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89db57d4f1b4ac00d35a159da5c46beb202cbc6977d96ea0d31528ff3a154718\": container with ID starting with 89db57d4f1b4ac00d35a159da5c46beb202cbc6977d96ea0d31528ff3a154718 not found: ID does not exist" containerID="89db57d4f1b4ac00d35a159da5c46beb202cbc6977d96ea0d31528ff3a154718" Apr 24 21:29:25.011575 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:29:25.011528 2560 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89db57d4f1b4ac00d35a159da5c46beb202cbc6977d96ea0d31528ff3a154718"} err="failed to get container status \"89db57d4f1b4ac00d35a159da5c46beb202cbc6977d96ea0d31528ff3a154718\": rpc error: code = NotFound desc = could not find container \"89db57d4f1b4ac00d35a159da5c46beb202cbc6977d96ea0d31528ff3a154718\": container with ID starting with 89db57d4f1b4ac00d35a159da5c46beb202cbc6977d96ea0d31528ff3a154718 not found: ID does not exist" Apr 24 21:29:26.445751 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:29:26.445720 2560 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85bb9951-4b95-4ea6-aeeb-0fdcbed1d016" path="/var/lib/kubelet/pods/85bb9951-4b95-4ea6-aeeb-0fdcbed1d016/volumes" Apr 24 21:29:26.580690 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:29:26.580657 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-65f5c5ddf54pq56"] Apr 24 21:29:26.581115 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:29:26.581087 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="85bb9951-4b95-4ea6-aeeb-0fdcbed1d016" containerName="storage-initializer" Apr 24 21:29:26.581115 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:29:26.581111 2560 
state_mem.go:107] "Deleted CPUSet assignment" podUID="85bb9951-4b95-4ea6-aeeb-0fdcbed1d016" containerName="storage-initializer" Apr 24 21:29:26.581305 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:29:26.581142 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="85bb9951-4b95-4ea6-aeeb-0fdcbed1d016" containerName="main" Apr 24 21:29:26.581305 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:29:26.581152 2560 state_mem.go:107] "Deleted CPUSet assignment" podUID="85bb9951-4b95-4ea6-aeeb-0fdcbed1d016" containerName="main" Apr 24 21:29:26.581305 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:29:26.581218 2560 memory_manager.go:356] "RemoveStaleState removing state" podUID="85bb9951-4b95-4ea6-aeeb-0fdcbed1d016" containerName="main" Apr 24 21:29:26.586125 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:29:26.586081 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-65f5c5ddf54pq56" Apr 24 21:29:26.588461 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:29:26.588437 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvdde380eaa9fe1facad32d45131f9e34d-kserve-self-signed-certs\"" Apr 24 21:29:26.588571 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:29:26.588494 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-qdwck\"" Apr 24 21:29:26.596512 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:29:26.596492 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-65f5c5ddf54pq56"] Apr 24 21:29:26.681218 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:29:26.681180 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9f4a872f-08e2-45f2-869f-b40976f13efa-model-cache\") pod 
\"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-65f5c5ddf54pq56\" (UID: \"9f4a872f-08e2-45f2-869f-b40976f13efa\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-65f5c5ddf54pq56" Apr 24 21:29:26.681403 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:29:26.681235 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9f4a872f-08e2-45f2-869f-b40976f13efa-tmp-dir\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-65f5c5ddf54pq56\" (UID: \"9f4a872f-08e2-45f2-869f-b40976f13efa\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-65f5c5ddf54pq56" Apr 24 21:29:26.681403 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:29:26.681285 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctbcl\" (UniqueName: \"kubernetes.io/projected/9f4a872f-08e2-45f2-869f-b40976f13efa-kube-api-access-ctbcl\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-65f5c5ddf54pq56\" (UID: \"9f4a872f-08e2-45f2-869f-b40976f13efa\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-65f5c5ddf54pq56" Apr 24 21:29:26.681403 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:29:26.681345 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9f4a872f-08e2-45f2-869f-b40976f13efa-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-65f5c5ddf54pq56\" (UID: \"9f4a872f-08e2-45f2-869f-b40976f13efa\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-65f5c5ddf54pq56" Apr 24 21:29:26.681403 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:29:26.681395 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9f4a872f-08e2-45f2-869f-b40976f13efa-dshm\") pod 
\"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-65f5c5ddf54pq56\" (UID: \"9f4a872f-08e2-45f2-869f-b40976f13efa\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-65f5c5ddf54pq56" Apr 24 21:29:26.681565 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:29:26.681416 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9f4a872f-08e2-45f2-869f-b40976f13efa-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-65f5c5ddf54pq56\" (UID: \"9f4a872f-08e2-45f2-869f-b40976f13efa\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-65f5c5ddf54pq56" Apr 24 21:29:26.681565 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:29:26.681446 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9f4a872f-08e2-45f2-869f-b40976f13efa-home\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-65f5c5ddf54pq56\" (UID: \"9f4a872f-08e2-45f2-869f-b40976f13efa\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-65f5c5ddf54pq56" Apr 24 21:29:26.782735 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:29:26.782690 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9f4a872f-08e2-45f2-869f-b40976f13efa-dshm\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-65f5c5ddf54pq56\" (UID: \"9f4a872f-08e2-45f2-869f-b40976f13efa\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-65f5c5ddf54pq56" Apr 24 21:29:26.782735 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:29:26.782741 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9f4a872f-08e2-45f2-869f-b40976f13efa-kserve-provision-location\") pod 
\"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-65f5c5ddf54pq56\" (UID: \"9f4a872f-08e2-45f2-869f-b40976f13efa\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-65f5c5ddf54pq56" Apr 24 21:29:26.782995 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:29:26.782860 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9f4a872f-08e2-45f2-869f-b40976f13efa-home\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-65f5c5ddf54pq56\" (UID: \"9f4a872f-08e2-45f2-869f-b40976f13efa\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-65f5c5ddf54pq56" Apr 24 21:29:26.782995 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:29:26.782955 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9f4a872f-08e2-45f2-869f-b40976f13efa-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-65f5c5ddf54pq56\" (UID: \"9f4a872f-08e2-45f2-869f-b40976f13efa\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-65f5c5ddf54pq56" Apr 24 21:29:26.783109 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:29:26.783003 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9f4a872f-08e2-45f2-869f-b40976f13efa-tmp-dir\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-65f5c5ddf54pq56\" (UID: \"9f4a872f-08e2-45f2-869f-b40976f13efa\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-65f5c5ddf54pq56" Apr 24 21:29:26.783109 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:29:26.783033 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ctbcl\" (UniqueName: \"kubernetes.io/projected/9f4a872f-08e2-45f2-869f-b40976f13efa-kube-api-access-ctbcl\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-65f5c5ddf54pq56\" (UID: 
\"9f4a872f-08e2-45f2-869f-b40976f13efa\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-65f5c5ddf54pq56" Apr 24 21:29:26.783109 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:29:26.783083 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9f4a872f-08e2-45f2-869f-b40976f13efa-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-65f5c5ddf54pq56\" (UID: \"9f4a872f-08e2-45f2-869f-b40976f13efa\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-65f5c5ddf54pq56" Apr 24 21:29:26.783109 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:29:26.783093 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9f4a872f-08e2-45f2-869f-b40976f13efa-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-65f5c5ddf54pq56\" (UID: \"9f4a872f-08e2-45f2-869f-b40976f13efa\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-65f5c5ddf54pq56" Apr 24 21:29:26.783319 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:29:26.783219 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9f4a872f-08e2-45f2-869f-b40976f13efa-home\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-65f5c5ddf54pq56\" (UID: \"9f4a872f-08e2-45f2-869f-b40976f13efa\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-65f5c5ddf54pq56" Apr 24 21:29:26.783370 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:29:26.783314 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9f4a872f-08e2-45f2-869f-b40976f13efa-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-65f5c5ddf54pq56\" (UID: \"9f4a872f-08e2-45f2-869f-b40976f13efa\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-65f5c5ddf54pq56" Apr 24 21:29:26.783370 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:29:26.783342 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9f4a872f-08e2-45f2-869f-b40976f13efa-tmp-dir\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-65f5c5ddf54pq56\" (UID: \"9f4a872f-08e2-45f2-869f-b40976f13efa\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-65f5c5ddf54pq56" Apr 24 21:29:26.785006 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:29:26.784984 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9f4a872f-08e2-45f2-869f-b40976f13efa-dshm\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-65f5c5ddf54pq56\" (UID: \"9f4a872f-08e2-45f2-869f-b40976f13efa\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-65f5c5ddf54pq56" Apr 24 21:29:26.785368 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:29:26.785349 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9f4a872f-08e2-45f2-869f-b40976f13efa-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-65f5c5ddf54pq56\" (UID: \"9f4a872f-08e2-45f2-869f-b40976f13efa\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-65f5c5ddf54pq56" Apr 24 21:29:26.797010 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:29:26.796985 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctbcl\" (UniqueName: \"kubernetes.io/projected/9f4a872f-08e2-45f2-869f-b40976f13efa-kube-api-access-ctbcl\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-65f5c5ddf54pq56\" (UID: \"9f4a872f-08e2-45f2-869f-b40976f13efa\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-65f5c5ddf54pq56" Apr 24 21:29:26.897669 
ip-10-0-128-142 kubenswrapper[2560]: I0424 21:29:26.897634 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-65f5c5ddf54pq56" Apr 24 21:29:27.019068 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:29:27.019041 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-65f5c5ddf54pq56"] Apr 24 21:29:27.020767 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:29:27.020738 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f4a872f_08e2_45f2_869f_b40976f13efa.slice/crio-9693bc7f058bef6d3feba80b7ef4d48575bdf7429d068793d2c569cf889af9bf WatchSource:0}: Error finding container 9693bc7f058bef6d3feba80b7ef4d48575bdf7429d068793d2c569cf889af9bf: Status 404 returned error can't find the container with id 9693bc7f058bef6d3feba80b7ef4d48575bdf7429d068793d2c569cf889af9bf Apr 24 21:29:27.956814 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:29:27.956780 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-65f5c5ddf54pq56" event={"ID":"9f4a872f-08e2-45f2-869f-b40976f13efa","Type":"ContainerStarted","Data":"46e351dca4eec5e14724b4440713bb85cf3ced1d12bb5180c2a48350e4d033da"} Apr 24 21:29:27.956814 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:29:27.956815 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-65f5c5ddf54pq56" event={"ID":"9f4a872f-08e2-45f2-869f-b40976f13efa","Type":"ContainerStarted","Data":"9693bc7f058bef6d3feba80b7ef4d48575bdf7429d068793d2c569cf889af9bf"} Apr 24 21:30:17.408287 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:30:17.408212 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs"] Apr 24 
21:30:17.411377 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:30:17.411362 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" Apr 24 21:30:17.413639 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:30:17.413618 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-epp-sa-dockercfg-t7b2b\"" Apr 24 21:30:17.413745 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:30:17.413622 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-kserve-self-signed-certs\"" Apr 24 21:30:17.425500 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:30:17.425477 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs"] Apr 24 21:30:17.559835 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:30:17.559798 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/34cfb37c-464d-48d7-bb12-fb57061b8e43-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs\" (UID: \"34cfb37c-464d-48d7-bb12-fb57061b8e43\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" Apr 24 21:30:17.559835 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:30:17.559837 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/34cfb37c-464d-48d7-bb12-fb57061b8e43-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs\" (UID: \"34cfb37c-464d-48d7-bb12-fb57061b8e43\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" Apr 24 21:30:17.560094 ip-10-0-128-142 
kubenswrapper[2560]: I0424 21:30:17.559875 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/34cfb37c-464d-48d7-bb12-fb57061b8e43-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs\" (UID: \"34cfb37c-464d-48d7-bb12-fb57061b8e43\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" Apr 24 21:30:17.560094 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:30:17.559901 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzjct\" (UniqueName: \"kubernetes.io/projected/34cfb37c-464d-48d7-bb12-fb57061b8e43-kube-api-access-gzjct\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs\" (UID: \"34cfb37c-464d-48d7-bb12-fb57061b8e43\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" Apr 24 21:30:17.560094 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:30:17.559996 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/34cfb37c-464d-48d7-bb12-fb57061b8e43-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs\" (UID: \"34cfb37c-464d-48d7-bb12-fb57061b8e43\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" Apr 24 21:30:17.560094 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:30:17.560024 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/34cfb37c-464d-48d7-bb12-fb57061b8e43-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs\" (UID: \"34cfb37c-464d-48d7-bb12-fb57061b8e43\") " 
pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" Apr 24 21:30:17.661310 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:30:17.661230 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/34cfb37c-464d-48d7-bb12-fb57061b8e43-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs\" (UID: \"34cfb37c-464d-48d7-bb12-fb57061b8e43\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" Apr 24 21:30:17.661310 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:30:17.661273 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/34cfb37c-464d-48d7-bb12-fb57061b8e43-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs\" (UID: \"34cfb37c-464d-48d7-bb12-fb57061b8e43\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" Apr 24 21:30:17.661310 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:30:17.661310 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/34cfb37c-464d-48d7-bb12-fb57061b8e43-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs\" (UID: \"34cfb37c-464d-48d7-bb12-fb57061b8e43\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" Apr 24 21:30:17.661514 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:30:17.661328 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/34cfb37c-464d-48d7-bb12-fb57061b8e43-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs\" (UID: \"34cfb37c-464d-48d7-bb12-fb57061b8e43\") " 
pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" Apr 24 21:30:17.661514 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:30:17.661356 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/34cfb37c-464d-48d7-bb12-fb57061b8e43-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs\" (UID: \"34cfb37c-464d-48d7-bb12-fb57061b8e43\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" Apr 24 21:30:17.661514 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:30:17.661373 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gzjct\" (UniqueName: \"kubernetes.io/projected/34cfb37c-464d-48d7-bb12-fb57061b8e43-kube-api-access-gzjct\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs\" (UID: \"34cfb37c-464d-48d7-bb12-fb57061b8e43\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" Apr 24 21:30:17.661674 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:30:17.661653 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/34cfb37c-464d-48d7-bb12-fb57061b8e43-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs\" (UID: \"34cfb37c-464d-48d7-bb12-fb57061b8e43\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" Apr 24 21:30:17.661724 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:30:17.661672 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/34cfb37c-464d-48d7-bb12-fb57061b8e43-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs\" (UID: \"34cfb37c-464d-48d7-bb12-fb57061b8e43\") " 
pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" Apr 24 21:30:17.661762 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:30:17.661717 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/34cfb37c-464d-48d7-bb12-fb57061b8e43-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs\" (UID: \"34cfb37c-464d-48d7-bb12-fb57061b8e43\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" Apr 24 21:30:17.661813 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:30:17.661796 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/34cfb37c-464d-48d7-bb12-fb57061b8e43-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs\" (UID: \"34cfb37c-464d-48d7-bb12-fb57061b8e43\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" Apr 24 21:30:17.663645 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:30:17.663624 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/34cfb37c-464d-48d7-bb12-fb57061b8e43-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs\" (UID: \"34cfb37c-464d-48d7-bb12-fb57061b8e43\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" Apr 24 21:30:17.668815 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:30:17.668795 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzjct\" (UniqueName: \"kubernetes.io/projected/34cfb37c-464d-48d7-bb12-fb57061b8e43-kube-api-access-gzjct\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs\" (UID: \"34cfb37c-464d-48d7-bb12-fb57061b8e43\") " 
pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" Apr 24 21:30:17.720400 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:30:17.720377 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" Apr 24 21:30:17.877855 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:30:17.877827 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs"] Apr 24 21:30:17.879188 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:30:17.879160 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34cfb37c_464d_48d7_bb12_fb57061b8e43.slice/crio-31e57d82a5bd7d1973f823f90cb7ed34616c017abccab15559accd9c69934f23 WatchSource:0}: Error finding container 31e57d82a5bd7d1973f823f90cb7ed34616c017abccab15559accd9c69934f23: Status 404 returned error can't find the container with id 31e57d82a5bd7d1973f823f90cb7ed34616c017abccab15559accd9c69934f23 Apr 24 21:30:18.138650 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:30:18.138613 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" event={"ID":"34cfb37c-464d-48d7-bb12-fb57061b8e43","Type":"ContainerStarted","Data":"de3e1679cdc9f25e6515bb0b52c4064ac35c5e251e5e3885979ead453834b14d"} Apr 24 21:30:18.138650 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:30:18.138649 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" event={"ID":"34cfb37c-464d-48d7-bb12-fb57061b8e43","Type":"ContainerStarted","Data":"31e57d82a5bd7d1973f823f90cb7ed34616c017abccab15559accd9c69934f23"} Apr 24 21:30:19.143905 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:30:19.143870 2560 generic.go:358] "Generic (PLEG): 
container finished" podID="34cfb37c-464d-48d7-bb12-fb57061b8e43" containerID="de3e1679cdc9f25e6515bb0b52c4064ac35c5e251e5e3885979ead453834b14d" exitCode=0 Apr 24 21:30:19.144315 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:30:19.143965 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" event={"ID":"34cfb37c-464d-48d7-bb12-fb57061b8e43","Type":"ContainerDied","Data":"de3e1679cdc9f25e6515bb0b52c4064ac35c5e251e5e3885979ead453834b14d"} Apr 24 21:30:21.154878 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:30:21.154830 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" event={"ID":"34cfb37c-464d-48d7-bb12-fb57061b8e43","Type":"ContainerStarted","Data":"7a130f511d2d7a8133274c95919466c86b66bd2ec3b77cbf74579d96f9cba310"} Apr 24 21:30:50.270359 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:30:50.270325 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" event={"ID":"34cfb37c-464d-48d7-bb12-fb57061b8e43","Type":"ContainerStarted","Data":"26a06b3fa60205c1abe98e89bc6c3be6adce773e378ea34399a6e85bcf6fcd5d"} Apr 24 21:30:50.270865 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:30:50.270492 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" Apr 24 21:30:50.272805 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:30:50.272764 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" podUID="34cfb37c-464d-48d7-bb12-fb57061b8e43" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 24 21:30:50.293866 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:30:50.293821 2560 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" podStartSLOduration=2.81935836 podStartE2EDuration="33.293808845s" podCreationTimestamp="2026-04-24 21:30:17 +0000 UTC" firstStartedPulling="2026-04-24 21:30:19.144961427 +0000 UTC m=+807.203749077" lastFinishedPulling="2026-04-24 21:30:49.619411896 +0000 UTC m=+837.678199562" observedRunningTime="2026-04-24 21:30:50.290563109 +0000 UTC m=+838.349350784" watchObservedRunningTime="2026-04-24 21:30:50.293808845 +0000 UTC m=+838.352596517" Apr 24 21:30:51.275155 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:30:51.275118 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" podUID="34cfb37c-464d-48d7-bb12-fb57061b8e43" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 24 21:30:54.288939 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:30:54.288884 2560 generic.go:358] "Generic (PLEG): container finished" podID="9f4a872f-08e2-45f2-869f-b40976f13efa" containerID="46e351dca4eec5e14724b4440713bb85cf3ced1d12bb5180c2a48350e4d033da" exitCode=0 Apr 24 21:30:54.289386 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:30:54.288958 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-65f5c5ddf54pq56" event={"ID":"9f4a872f-08e2-45f2-869f-b40976f13efa","Type":"ContainerDied","Data":"46e351dca4eec5e14724b4440713bb85cf3ced1d12bb5180c2a48350e4d033da"} Apr 24 21:30:57.721243 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:30:57.721206 2560 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" Apr 24 21:30:57.721703 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:30:57.721356 2560 prober.go:120] "Probe failed" 
probeType="Startup" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" podUID="34cfb37c-464d-48d7-bb12-fb57061b8e43" containerName="tokenizer" probeResult="failure" output="Get \"http://10.134.0.38:8082/healthz\": dial tcp 10.134.0.38:8082: connect: connection refused" Apr 24 21:30:57.721703 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:30:57.721416 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" Apr 24 21:30:57.724778 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:30:57.724709 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" podUID="34cfb37c-464d-48d7-bb12-fb57061b8e43" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 24 21:30:58.307285 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:30:58.307251 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" podUID="34cfb37c-464d-48d7-bb12-fb57061b8e43" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 24 21:31:07.723157 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:31:07.723124 2560 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" Apr 24 21:31:07.724430 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:31:07.724405 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" Apr 24 21:31:07.724558 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:31:07.724525 2560 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" podUID="34cfb37c-464d-48d7-bb12-fb57061b8e43" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 24 21:31:08.349740 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:31:08.349702 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" podUID="34cfb37c-464d-48d7-bb12-fb57061b8e43" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 24 21:31:18.350167 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:31:18.350123 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" podUID="34cfb37c-464d-48d7-bb12-fb57061b8e43" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 24 21:31:28.350124 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:31:28.350081 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" podUID="34cfb37c-464d-48d7-bb12-fb57061b8e43" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 24 21:31:36.457271 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:31:36.457236 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-65f5c5ddf54pq56" event={"ID":"9f4a872f-08e2-45f2-869f-b40976f13efa","Type":"ContainerStarted","Data":"a87e9c88851b4c28933242aca14f004bb19eb18df1b69cd0fd893506f3af37a6"} Apr 24 21:31:36.478860 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:31:36.478805 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-65f5c5ddf54pq56" 
podStartSLOduration=88.531201951 podStartE2EDuration="2m10.478789095s" podCreationTimestamp="2026-04-24 21:29:26 +0000 UTC" firstStartedPulling="2026-04-24 21:30:54.290090235 +0000 UTC m=+842.348877886" lastFinishedPulling="2026-04-24 21:31:36.237677375 +0000 UTC m=+884.296465030" observedRunningTime="2026-04-24 21:31:36.475287028 +0000 UTC m=+884.534074698" watchObservedRunningTime="2026-04-24 21:31:36.478789095 +0000 UTC m=+884.537576767" Apr 24 21:31:36.897844 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:31:36.897800 2560 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-65f5c5ddf54pq56" Apr 24 21:31:36.897844 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:31:36.897847 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-65f5c5ddf54pq56" Apr 24 21:31:36.899448 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:31:36.899415 2560 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-65f5c5ddf54pq56" podUID="9f4a872f-08e2-45f2-869f-b40976f13efa" containerName="main" probeResult="failure" output="Get \"https://10.134.0.37:8000/health\": dial tcp 10.134.0.37:8000: connect: connection refused" Apr 24 21:31:38.350189 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:31:38.350145 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" podUID="34cfb37c-464d-48d7-bb12-fb57061b8e43" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 24 21:31:46.898954 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:31:46.898837 2560 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-65f5c5ddf54pq56" 
podUID="9f4a872f-08e2-45f2-869f-b40976f13efa" containerName="main" probeResult="failure" output="Get \"https://10.134.0.37:8000/health\": dial tcp 10.134.0.37:8000: connect: connection refused" Apr 24 21:31:48.350424 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:31:48.350383 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" podUID="34cfb37c-464d-48d7-bb12-fb57061b8e43" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 24 21:31:52.390943 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:31:52.390889 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6rbsp_260be1ca-939b-4b9e-9f93-078d2506aef0/ovn-acl-logging/0.log" Apr 24 21:31:52.392016 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:31:52.391995 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-128-142.ec2.internal_784ad9f5e87f1e75291c25fc06106e5e/kube-rbac-proxy-crio/2.log" Apr 24 21:31:52.392141 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:31:52.392020 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6rbsp_260be1ca-939b-4b9e-9f93-078d2506aef0/ovn-acl-logging/0.log" Apr 24 21:31:52.392904 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:31:52.392889 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-128-142.ec2.internal_784ad9f5e87f1e75291c25fc06106e5e/kube-rbac-proxy-crio/2.log" Apr 24 21:31:56.898724 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:31:56.898677 2560 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-65f5c5ddf54pq56" podUID="9f4a872f-08e2-45f2-869f-b40976f13efa" containerName="main" probeResult="failure" output="Get 
\"https://10.134.0.37:8000/health\": dial tcp 10.134.0.37:8000: connect: connection refused" Apr 24 21:31:58.350160 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:31:58.350112 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" podUID="34cfb37c-464d-48d7-bb12-fb57061b8e43" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 24 21:32:06.898089 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:32:06.898047 2560 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-65f5c5ddf54pq56" podUID="9f4a872f-08e2-45f2-869f-b40976f13efa" containerName="main" probeResult="failure" output="Get \"https://10.134.0.37:8000/health\": dial tcp 10.134.0.37:8000: connect: connection refused" Apr 24 21:32:08.349894 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:32:08.349849 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" podUID="34cfb37c-464d-48d7-bb12-fb57061b8e43" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 24 21:32:16.898677 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:32:16.898628 2560 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-65f5c5ddf54pq56" podUID="9f4a872f-08e2-45f2-869f-b40976f13efa" containerName="main" probeResult="failure" output="Get \"https://10.134.0.37:8000/health\": dial tcp 10.134.0.37:8000: connect: connection refused" Apr 24 21:32:17.442699 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:32:17.442666 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" podUID="34cfb37c-464d-48d7-bb12-fb57061b8e43" containerName="main" 
probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 24 21:32:26.898798 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:32:26.898753 2560 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-65f5c5ddf54pq56" podUID="9f4a872f-08e2-45f2-869f-b40976f13efa" containerName="main" probeResult="failure" output="Get \"https://10.134.0.37:8000/health\": dial tcp 10.134.0.37:8000: connect: connection refused" Apr 24 21:32:27.443789 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:32:27.443745 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" podUID="34cfb37c-464d-48d7-bb12-fb57061b8e43" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 24 21:32:36.898323 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:32:36.898277 2560 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-65f5c5ddf54pq56" podUID="9f4a872f-08e2-45f2-869f-b40976f13efa" containerName="main" probeResult="failure" output="Get \"https://10.134.0.37:8000/health\": dial tcp 10.134.0.37:8000: connect: connection refused" Apr 24 21:32:37.442882 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:32:37.442845 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" podUID="34cfb37c-464d-48d7-bb12-fb57061b8e43" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 24 21:32:46.898828 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:32:46.898783 2560 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-65f5c5ddf54pq56" podUID="9f4a872f-08e2-45f2-869f-b40976f13efa" containerName="main" probeResult="failure" 
output="Get \"https://10.134.0.37:8000/health\": dial tcp 10.134.0.37:8000: connect: connection refused" Apr 24 21:32:47.443169 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:32:47.443128 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" podUID="34cfb37c-464d-48d7-bb12-fb57061b8e43" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 24 21:32:56.898319 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:32:56.898267 2560 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-65f5c5ddf54pq56" podUID="9f4a872f-08e2-45f2-869f-b40976f13efa" containerName="main" probeResult="failure" output="Get \"https://10.134.0.37:8000/health\": dial tcp 10.134.0.37:8000: connect: connection refused" Apr 24 21:32:57.443385 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:32:57.443346 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" podUID="34cfb37c-464d-48d7-bb12-fb57061b8e43" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 24 21:33:06.907720 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:33:06.907687 2560 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-65f5c5ddf54pq56" Apr 24 21:33:06.915096 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:33:06.915074 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-65f5c5ddf54pq56" Apr 24 21:33:07.442771 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:33:07.442733 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" 
podUID="34cfb37c-464d-48d7-bb12-fb57061b8e43" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 24 21:33:12.710367 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:33:12.710295 2560 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-65f5c5ddf54pq56"] Apr 24 21:33:12.710726 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:33:12.710613 2560 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-65f5c5ddf54pq56" podUID="9f4a872f-08e2-45f2-869f-b40976f13efa" containerName="main" containerID="cri-o://a87e9c88851b4c28933242aca14f004bb19eb18df1b69cd0fd893506f3af37a6" gracePeriod=30 Apr 24 21:33:17.443341 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:33:17.443306 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" podUID="34cfb37c-464d-48d7-bb12-fb57061b8e43" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 24 21:33:26.791632 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:33:26.791599 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-6944cc8dd7-qjrjw"] Apr 24 21:33:26.793717 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:33:26.793700 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6944cc8dd7-qjrjw" Apr 24 21:33:26.796275 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:33:26.796257 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-test-kserve-self-signed-certs\"" Apr 24 21:33:26.804695 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:33:26.804670 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-6944cc8dd7-qjrjw"] Apr 24 21:33:26.829673 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:33:26.829648 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5b5c6a98-e903-4858-9420-0cf3a0ca6a4e-tmp-dir\") pod \"custom-route-timeout-test-kserve-6944cc8dd7-qjrjw\" (UID: \"5b5c6a98-e903-4858-9420-0cf3a0ca6a4e\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6944cc8dd7-qjrjw" Apr 24 21:33:26.829792 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:33:26.829681 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5b5c6a98-e903-4858-9420-0cf3a0ca6a4e-tls-certs\") pod \"custom-route-timeout-test-kserve-6944cc8dd7-qjrjw\" (UID: \"5b5c6a98-e903-4858-9420-0cf3a0ca6a4e\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6944cc8dd7-qjrjw" Apr 24 21:33:26.829792 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:33:26.829702 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5b5c6a98-e903-4858-9420-0cf3a0ca6a4e-dshm\") pod \"custom-route-timeout-test-kserve-6944cc8dd7-qjrjw\" (UID: \"5b5c6a98-e903-4858-9420-0cf3a0ca6a4e\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6944cc8dd7-qjrjw" Apr 24 21:33:26.829792 ip-10-0-128-142 kubenswrapper[2560]: 
I0424 21:33:26.829732 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpdpq\" (UniqueName: \"kubernetes.io/projected/5b5c6a98-e903-4858-9420-0cf3a0ca6a4e-kube-api-access-cpdpq\") pod \"custom-route-timeout-test-kserve-6944cc8dd7-qjrjw\" (UID: \"5b5c6a98-e903-4858-9420-0cf3a0ca6a4e\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6944cc8dd7-qjrjw" Apr 24 21:33:26.829792 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:33:26.829774 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5b5c6a98-e903-4858-9420-0cf3a0ca6a4e-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-6944cc8dd7-qjrjw\" (UID: \"5b5c6a98-e903-4858-9420-0cf3a0ca6a4e\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6944cc8dd7-qjrjw" Apr 24 21:33:26.829983 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:33:26.829810 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5b5c6a98-e903-4858-9420-0cf3a0ca6a4e-model-cache\") pod \"custom-route-timeout-test-kserve-6944cc8dd7-qjrjw\" (UID: \"5b5c6a98-e903-4858-9420-0cf3a0ca6a4e\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6944cc8dd7-qjrjw" Apr 24 21:33:26.829983 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:33:26.829838 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5b5c6a98-e903-4858-9420-0cf3a0ca6a4e-home\") pod \"custom-route-timeout-test-kserve-6944cc8dd7-qjrjw\" (UID: \"5b5c6a98-e903-4858-9420-0cf3a0ca6a4e\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6944cc8dd7-qjrjw" Apr 24 21:33:26.931146 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:33:26.931116 2560 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5b5c6a98-e903-4858-9420-0cf3a0ca6a4e-tls-certs\") pod \"custom-route-timeout-test-kserve-6944cc8dd7-qjrjw\" (UID: \"5b5c6a98-e903-4858-9420-0cf3a0ca6a4e\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6944cc8dd7-qjrjw" Apr 24 21:33:26.931146 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:33:26.931150 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5b5c6a98-e903-4858-9420-0cf3a0ca6a4e-dshm\") pod \"custom-route-timeout-test-kserve-6944cc8dd7-qjrjw\" (UID: \"5b5c6a98-e903-4858-9420-0cf3a0ca6a4e\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6944cc8dd7-qjrjw" Apr 24 21:33:26.931355 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:33:26.931171 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cpdpq\" (UniqueName: \"kubernetes.io/projected/5b5c6a98-e903-4858-9420-0cf3a0ca6a4e-kube-api-access-cpdpq\") pod \"custom-route-timeout-test-kserve-6944cc8dd7-qjrjw\" (UID: \"5b5c6a98-e903-4858-9420-0cf3a0ca6a4e\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6944cc8dd7-qjrjw" Apr 24 21:33:26.931355 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:33:26.931196 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5b5c6a98-e903-4858-9420-0cf3a0ca6a4e-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-6944cc8dd7-qjrjw\" (UID: \"5b5c6a98-e903-4858-9420-0cf3a0ca6a4e\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6944cc8dd7-qjrjw" Apr 24 21:33:26.931355 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:33:26.931248 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5b5c6a98-e903-4858-9420-0cf3a0ca6a4e-model-cache\") pod 
\"custom-route-timeout-test-kserve-6944cc8dd7-qjrjw\" (UID: \"5b5c6a98-e903-4858-9420-0cf3a0ca6a4e\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6944cc8dd7-qjrjw" Apr 24 21:33:26.931503 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:33:26.931408 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5b5c6a98-e903-4858-9420-0cf3a0ca6a4e-home\") pod \"custom-route-timeout-test-kserve-6944cc8dd7-qjrjw\" (UID: \"5b5c6a98-e903-4858-9420-0cf3a0ca6a4e\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6944cc8dd7-qjrjw" Apr 24 21:33:26.931574 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:33:26.931511 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5b5c6a98-e903-4858-9420-0cf3a0ca6a4e-tmp-dir\") pod \"custom-route-timeout-test-kserve-6944cc8dd7-qjrjw\" (UID: \"5b5c6a98-e903-4858-9420-0cf3a0ca6a4e\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6944cc8dd7-qjrjw" Apr 24 21:33:26.931574 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:33:26.931529 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5b5c6a98-e903-4858-9420-0cf3a0ca6a4e-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-6944cc8dd7-qjrjw\" (UID: \"5b5c6a98-e903-4858-9420-0cf3a0ca6a4e\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6944cc8dd7-qjrjw" Apr 24 21:33:26.931671 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:33:26.931613 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5b5c6a98-e903-4858-9420-0cf3a0ca6a4e-model-cache\") pod \"custom-route-timeout-test-kserve-6944cc8dd7-qjrjw\" (UID: \"5b5c6a98-e903-4858-9420-0cf3a0ca6a4e\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6944cc8dd7-qjrjw" Apr 24 
21:33:26.931769 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:33:26.931742 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5b5c6a98-e903-4858-9420-0cf3a0ca6a4e-home\") pod \"custom-route-timeout-test-kserve-6944cc8dd7-qjrjw\" (UID: \"5b5c6a98-e903-4858-9420-0cf3a0ca6a4e\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6944cc8dd7-qjrjw" Apr 24 21:33:26.931850 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:33:26.931832 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5b5c6a98-e903-4858-9420-0cf3a0ca6a4e-tmp-dir\") pod \"custom-route-timeout-test-kserve-6944cc8dd7-qjrjw\" (UID: \"5b5c6a98-e903-4858-9420-0cf3a0ca6a4e\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6944cc8dd7-qjrjw" Apr 24 21:33:26.933415 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:33:26.933394 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5b5c6a98-e903-4858-9420-0cf3a0ca6a4e-dshm\") pod \"custom-route-timeout-test-kserve-6944cc8dd7-qjrjw\" (UID: \"5b5c6a98-e903-4858-9420-0cf3a0ca6a4e\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6944cc8dd7-qjrjw" Apr 24 21:33:26.933563 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:33:26.933547 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5b5c6a98-e903-4858-9420-0cf3a0ca6a4e-tls-certs\") pod \"custom-route-timeout-test-kserve-6944cc8dd7-qjrjw\" (UID: \"5b5c6a98-e903-4858-9420-0cf3a0ca6a4e\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6944cc8dd7-qjrjw" Apr 24 21:33:26.939151 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:33:26.939130 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpdpq\" (UniqueName: 
\"kubernetes.io/projected/5b5c6a98-e903-4858-9420-0cf3a0ca6a4e-kube-api-access-cpdpq\") pod \"custom-route-timeout-test-kserve-6944cc8dd7-qjrjw\" (UID: \"5b5c6a98-e903-4858-9420-0cf3a0ca6a4e\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6944cc8dd7-qjrjw" Apr 24 21:33:27.103815 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:33:27.103727 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6944cc8dd7-qjrjw" Apr 24 21:33:27.231739 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:33:27.231577 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-6944cc8dd7-qjrjw"] Apr 24 21:33:27.234619 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:33:27.234590 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b5c6a98_e903_4858_9420_0cf3a0ca6a4e.slice/crio-047324f5c3e5c639ef27788f7558c975d75962b6b38d6e11f4f6b82a5e9a41db WatchSource:0}: Error finding container 047324f5c3e5c639ef27788f7558c975d75962b6b38d6e11f4f6b82a5e9a41db: Status 404 returned error can't find the container with id 047324f5c3e5c639ef27788f7558c975d75962b6b38d6e11f4f6b82a5e9a41db Apr 24 21:33:27.236757 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:33:27.236742 2560 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 21:33:27.443451 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:33:27.443350 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" podUID="34cfb37c-464d-48d7-bb12-fb57061b8e43" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 24 21:33:27.847339 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:33:27.847306 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6944cc8dd7-qjrjw" event={"ID":"5b5c6a98-e903-4858-9420-0cf3a0ca6a4e","Type":"ContainerStarted","Data":"c21bae1ed57b86bdf22f0572e9dedf0be21d6888b8cfdc9976360cad25080e6d"}
Apr 24 21:33:27.847339 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:33:27.847343 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6944cc8dd7-qjrjw" event={"ID":"5b5c6a98-e903-4858-9420-0cf3a0ca6a4e","Type":"ContainerStarted","Data":"047324f5c3e5c639ef27788f7558c975d75962b6b38d6e11f4f6b82a5e9a41db"}
Apr 24 21:33:31.867729 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:33:31.867688 2560 generic.go:358] "Generic (PLEG): container finished" podID="5b5c6a98-e903-4858-9420-0cf3a0ca6a4e" containerID="c21bae1ed57b86bdf22f0572e9dedf0be21d6888b8cfdc9976360cad25080e6d" exitCode=0
Apr 24 21:33:31.868169 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:33:31.867763 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6944cc8dd7-qjrjw" event={"ID":"5b5c6a98-e903-4858-9420-0cf3a0ca6a4e","Type":"ContainerDied","Data":"c21bae1ed57b86bdf22f0572e9dedf0be21d6888b8cfdc9976360cad25080e6d"}
Apr 24 21:33:32.873316 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:33:32.873272 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6944cc8dd7-qjrjw" event={"ID":"5b5c6a98-e903-4858-9420-0cf3a0ca6a4e","Type":"ContainerStarted","Data":"9723a9974595b941753f3ce5e3b3007ede8a0e1cf6a4a00cd3ecc4bdf00b865d"}
Apr 24 21:33:32.896753 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:33:32.896665 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6944cc8dd7-qjrjw" podStartSLOduration=6.8966466650000005 podStartE2EDuration="6.896646665s" podCreationTimestamp="2026-04-24 21:33:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:33:32.894595842 +0000 UTC m=+1000.953383514" watchObservedRunningTime="2026-04-24 21:33:32.896646665 +0000 UTC m=+1000.955434338"
Apr 24 21:33:37.104166 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:33:37.104120 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6944cc8dd7-qjrjw"
Apr 24 21:33:37.104634 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:33:37.104273 2560 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6944cc8dd7-qjrjw"
Apr 24 21:33:37.105747 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:33:37.105720 2560 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6944cc8dd7-qjrjw" podUID="5b5c6a98-e903-4858-9420-0cf3a0ca6a4e" containerName="main" probeResult="failure" output="Get \"https://10.134.0.39:8000/health\": dial tcp 10.134.0.39:8000: connect: connection refused"
Apr 24 21:33:37.443385 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:33:37.443298 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" podUID="34cfb37c-464d-48d7-bb12-fb57061b8e43" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")"
Apr 24 21:33:38.443229 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:33:38.443194 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" podUID="34cfb37c-464d-48d7-bb12-fb57061b8e43" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")"
Apr 24 21:33:42.912485 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:33:42.912460 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-65f5c5ddf54pq56_9f4a872f-08e2-45f2-869f-b40976f13efa/main/0.log"
Apr 24 21:33:42.912917 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:33:42.912790 2560 generic.go:358] "Generic (PLEG): container finished" podID="9f4a872f-08e2-45f2-869f-b40976f13efa" containerID="a87e9c88851b4c28933242aca14f004bb19eb18df1b69cd0fd893506f3af37a6" exitCode=137
Apr 24 21:33:42.912917 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:33:42.912875 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-65f5c5ddf54pq56" event={"ID":"9f4a872f-08e2-45f2-869f-b40976f13efa","Type":"ContainerDied","Data":"a87e9c88851b4c28933242aca14f004bb19eb18df1b69cd0fd893506f3af37a6"}
Apr 24 21:33:42.973793 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:33:42.973739 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-65f5c5ddf54pq56_9f4a872f-08e2-45f2-869f-b40976f13efa/main/0.log"
Apr 24 21:33:42.974091 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:33:42.974074 2560 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-65f5c5ddf54pq56"
Apr 24 21:33:43.070499 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:33:43.070461 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9f4a872f-08e2-45f2-869f-b40976f13efa-model-cache\") pod \"9f4a872f-08e2-45f2-869f-b40976f13efa\" (UID: \"9f4a872f-08e2-45f2-869f-b40976f13efa\") "
Apr 24 21:33:43.070693 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:33:43.070529 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9f4a872f-08e2-45f2-869f-b40976f13efa-dshm\") pod \"9f4a872f-08e2-45f2-869f-b40976f13efa\" (UID: \"9f4a872f-08e2-45f2-869f-b40976f13efa\") "
Apr 24 21:33:43.070693 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:33:43.070563 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9f4a872f-08e2-45f2-869f-b40976f13efa-home\") pod \"9f4a872f-08e2-45f2-869f-b40976f13efa\" (UID: \"9f4a872f-08e2-45f2-869f-b40976f13efa\") "
Apr 24 21:33:43.070693 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:33:43.070586 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9f4a872f-08e2-45f2-869f-b40976f13efa-tmp-dir\") pod \"9f4a872f-08e2-45f2-869f-b40976f13efa\" (UID: \"9f4a872f-08e2-45f2-869f-b40976f13efa\") "
Apr 24 21:33:43.070693 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:33:43.070603 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9f4a872f-08e2-45f2-869f-b40976f13efa-kserve-provision-location\") pod \"9f4a872f-08e2-45f2-869f-b40976f13efa\" (UID: \"9f4a872f-08e2-45f2-869f-b40976f13efa\") "
Apr 24 21:33:43.070693 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:33:43.070650 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ctbcl\" (UniqueName: \"kubernetes.io/projected/9f4a872f-08e2-45f2-869f-b40976f13efa-kube-api-access-ctbcl\") pod \"9f4a872f-08e2-45f2-869f-b40976f13efa\" (UID: \"9f4a872f-08e2-45f2-869f-b40976f13efa\") "
Apr 24 21:33:43.070693 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:33:43.070682 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9f4a872f-08e2-45f2-869f-b40976f13efa-tls-certs\") pod \"9f4a872f-08e2-45f2-869f-b40976f13efa\" (UID: \"9f4a872f-08e2-45f2-869f-b40976f13efa\") "
Apr 24 21:33:43.071047 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:33:43.070700 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f4a872f-08e2-45f2-869f-b40976f13efa-model-cache" (OuterVolumeSpecName: "model-cache") pod "9f4a872f-08e2-45f2-869f-b40976f13efa" (UID: "9f4a872f-08e2-45f2-869f-b40976f13efa"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:33:43.071047 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:33:43.070889 2560 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9f4a872f-08e2-45f2-869f-b40976f13efa-model-cache\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\""
Apr 24 21:33:43.071298 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:33:43.071246 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f4a872f-08e2-45f2-869f-b40976f13efa-home" (OuterVolumeSpecName: "home") pod "9f4a872f-08e2-45f2-869f-b40976f13efa" (UID: "9f4a872f-08e2-45f2-869f-b40976f13efa"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:33:43.072757 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:33:43.072733 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f4a872f-08e2-45f2-869f-b40976f13efa-dshm" (OuterVolumeSpecName: "dshm") pod "9f4a872f-08e2-45f2-869f-b40976f13efa" (UID: "9f4a872f-08e2-45f2-869f-b40976f13efa"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:33:43.073441 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:33:43.073412 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f4a872f-08e2-45f2-869f-b40976f13efa-kube-api-access-ctbcl" (OuterVolumeSpecName: "kube-api-access-ctbcl") pod "9f4a872f-08e2-45f2-869f-b40976f13efa" (UID: "9f4a872f-08e2-45f2-869f-b40976f13efa"). InnerVolumeSpecName "kube-api-access-ctbcl". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:33:43.073538 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:33:43.073509 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f4a872f-08e2-45f2-869f-b40976f13efa-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "9f4a872f-08e2-45f2-869f-b40976f13efa" (UID: "9f4a872f-08e2-45f2-869f-b40976f13efa"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:33:43.082939 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:33:43.082889 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f4a872f-08e2-45f2-869f-b40976f13efa-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "9f4a872f-08e2-45f2-869f-b40976f13efa" (UID: "9f4a872f-08e2-45f2-869f-b40976f13efa"). InnerVolumeSpecName "tmp-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:33:43.126332 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:33:43.126296 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f4a872f-08e2-45f2-869f-b40976f13efa-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "9f4a872f-08e2-45f2-869f-b40976f13efa" (UID: "9f4a872f-08e2-45f2-869f-b40976f13efa"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:33:43.172230 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:33:43.172204 2560 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9f4a872f-08e2-45f2-869f-b40976f13efa-dshm\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\""
Apr 24 21:33:43.172230 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:33:43.172228 2560 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9f4a872f-08e2-45f2-869f-b40976f13efa-home\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\""
Apr 24 21:33:43.172333 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:33:43.172237 2560 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9f4a872f-08e2-45f2-869f-b40976f13efa-tmp-dir\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\""
Apr 24 21:33:43.172333 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:33:43.172248 2560 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9f4a872f-08e2-45f2-869f-b40976f13efa-kserve-provision-location\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\""
Apr 24 21:33:43.172333 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:33:43.172257 2560 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ctbcl\" (UniqueName: \"kubernetes.io/projected/9f4a872f-08e2-45f2-869f-b40976f13efa-kube-api-access-ctbcl\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\""
Apr 24 21:33:43.172333 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:33:43.172266 2560 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9f4a872f-08e2-45f2-869f-b40976f13efa-tls-certs\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\""
Apr 24 21:33:43.918538 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:33:43.918516 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-65f5c5ddf54pq56_9f4a872f-08e2-45f2-869f-b40976f13efa/main/0.log"
Apr 24 21:33:43.919011 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:33:43.918966 2560 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-65f5c5ddf54pq56"
Apr 24 21:33:43.919011 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:33:43.918969 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-65f5c5ddf54pq56" event={"ID":"9f4a872f-08e2-45f2-869f-b40976f13efa","Type":"ContainerDied","Data":"9693bc7f058bef6d3feba80b7ef4d48575bdf7429d068793d2c569cf889af9bf"}
Apr 24 21:33:43.919140 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:33:43.919025 2560 scope.go:117] "RemoveContainer" containerID="a87e9c88851b4c28933242aca14f004bb19eb18df1b69cd0fd893506f3af37a6"
Apr 24 21:33:43.927987 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:33:43.927968 2560 scope.go:117] "RemoveContainer" containerID="46e351dca4eec5e14724b4440713bb85cf3ced1d12bb5180c2a48350e4d033da"
Apr 24 21:33:43.943249 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:33:43.943226 2560 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-65f5c5ddf54pq56"]
Apr 24 21:33:43.948244 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:33:43.948222 2560 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-65f5c5ddf54pq56"]
Apr 24 21:33:44.446081 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:33:44.446046 2560 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f4a872f-08e2-45f2-869f-b40976f13efa" path="/var/lib/kubelet/pods/9f4a872f-08e2-45f2-869f-b40976f13efa/volumes"
Apr 24 21:33:47.105136 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:33:47.105089 2560 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6944cc8dd7-qjrjw" podUID="5b5c6a98-e903-4858-9420-0cf3a0ca6a4e" containerName="main" probeResult="failure" output="Get \"https://10.134.0.39:8000/health\": dial tcp 10.134.0.39:8000: connect: connection refused"
Apr 24 21:33:48.449361 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:33:48.449321 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" podUID="34cfb37c-464d-48d7-bb12-fb57061b8e43" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")"
Apr 24 21:33:57.104453 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:33:57.104406 2560 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6944cc8dd7-qjrjw" podUID="5b5c6a98-e903-4858-9420-0cf3a0ca6a4e" containerName="main" probeResult="failure" output="Get \"https://10.134.0.39:8000/health\": dial tcp 10.134.0.39:8000: connect: connection refused"
Apr 24 21:33:58.443246 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:33:58.443208 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" podUID="34cfb37c-464d-48d7-bb12-fb57061b8e43" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")"
Apr 24 21:34:07.104595 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:34:07.104551 2560 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6944cc8dd7-qjrjw" podUID="5b5c6a98-e903-4858-9420-0cf3a0ca6a4e" containerName="main" probeResult="failure" output="Get \"https://10.134.0.39:8000/health\": dial tcp 10.134.0.39:8000: connect: connection refused"
Apr 24 21:34:08.443312 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:34:08.443271 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" podUID="34cfb37c-464d-48d7-bb12-fb57061b8e43" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")"
Apr 24 21:34:17.104478 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:34:17.104435 2560 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6944cc8dd7-qjrjw" podUID="5b5c6a98-e903-4858-9420-0cf3a0ca6a4e" containerName="main" probeResult="failure" output="Get \"https://10.134.0.39:8000/health\": dial tcp 10.134.0.39:8000: connect: connection refused"
Apr 24 21:34:18.443738 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:34:18.443704 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" podUID="34cfb37c-464d-48d7-bb12-fb57061b8e43" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")"
Apr 24 21:34:27.104541 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:34:27.104498 2560 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6944cc8dd7-qjrjw" podUID="5b5c6a98-e903-4858-9420-0cf3a0ca6a4e" containerName="main" probeResult="failure" output="Get \"https://10.134.0.39:8000/health\": dial tcp 10.134.0.39:8000: connect: connection refused"
Apr 24 21:34:28.443478 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:34:28.443432 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" podUID="34cfb37c-464d-48d7-bb12-fb57061b8e43" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")"
Apr 24 21:34:37.104232 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:34:37.104190 2560 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6944cc8dd7-qjrjw" podUID="5b5c6a98-e903-4858-9420-0cf3a0ca6a4e" containerName="main" probeResult="failure" output="Get \"https://10.134.0.39:8000/health\": dial tcp 10.134.0.39:8000: connect: connection refused"
Apr 24 21:34:38.443581 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:34:38.443539 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" podUID="34cfb37c-464d-48d7-bb12-fb57061b8e43" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")"
Apr 24 21:34:44.443440 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:34:44.443361 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" podUID="34cfb37c-464d-48d7-bb12-fb57061b8e43" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")"
Apr 24 21:34:47.104577 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:34:47.104525 2560 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6944cc8dd7-qjrjw" podUID="5b5c6a98-e903-4858-9420-0cf3a0ca6a4e" containerName="main" probeResult="failure" output="Get \"https://10.134.0.39:8000/health\": dial tcp 10.134.0.39:8000: connect: connection refused"
Apr 24 21:34:54.443447 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:34:54.443391 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" podUID="34cfb37c-464d-48d7-bb12-fb57061b8e43" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")"
Apr 24 21:34:57.104580 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:34:57.104533 2560 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6944cc8dd7-qjrjw" podUID="5b5c6a98-e903-4858-9420-0cf3a0ca6a4e" containerName="main" probeResult="failure" output="Get \"https://10.134.0.39:8000/health\": dial tcp 10.134.0.39:8000: connect: connection refused"
Apr 24 21:35:04.443970 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:35:04.443912 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" podUID="34cfb37c-464d-48d7-bb12-fb57061b8e43" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")"
Apr 24 21:35:07.114022 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:35:07.113989 2560 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6944cc8dd7-qjrjw"
Apr 24 21:35:07.121636 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:35:07.121608 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6944cc8dd7-qjrjw"
Apr 24 21:35:12.581872 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:35:12.581819 2560 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-6944cc8dd7-qjrjw"]
Apr 24 21:35:12.582490 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:35:12.582196 2560 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6944cc8dd7-qjrjw" podUID="5b5c6a98-e903-4858-9420-0cf3a0ca6a4e" containerName="main" containerID="cri-o://9723a9974595b941753f3ce5e3b3007ede8a0e1cf6a4a00cd3ecc4bdf00b865d" gracePeriod=30
Apr 24 21:35:14.443384 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:35:14.443340 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" podUID="34cfb37c-464d-48d7-bb12-fb57061b8e43" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")"
Apr 24 21:35:17.580773 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:35:17.580740 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-86f4fff555-k5gn5"]
Apr 24 21:35:17.581140 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:35:17.581053 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9f4a872f-08e2-45f2-869f-b40976f13efa" containerName="storage-initializer"
Apr 24 21:35:17.581140 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:35:17.581064 2560 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f4a872f-08e2-45f2-869f-b40976f13efa" containerName="storage-initializer"
Apr 24 21:35:17.581140 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:35:17.581073 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9f4a872f-08e2-45f2-869f-b40976f13efa" containerName="main"
Apr 24 21:35:17.581140 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:35:17.581079 2560 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f4a872f-08e2-45f2-869f-b40976f13efa" containerName="main"
Apr 24 21:35:17.581140 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:35:17.581121 2560 memory_manager.go:356] "RemoveStaleState removing state" podUID="9f4a872f-08e2-45f2-869f-b40976f13efa" containerName="main"
Apr 24 21:35:17.582832 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:35:17.582813 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-86f4fff555-k5gn5"
Apr 24 21:35:17.585203 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:35:17.585184 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-test-kserve-self-signed-certs\""
Apr 24 21:35:17.594839 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:35:17.594814 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-86f4fff555-k5gn5"]
Apr 24 21:35:17.669340 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:35:17.669306 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkthm\" (UniqueName: \"kubernetes.io/projected/320d920b-8b88-4531-9801-47ba810f9086-kube-api-access-nkthm\") pod \"router-with-refs-test-kserve-86f4fff555-k5gn5\" (UID: \"320d920b-8b88-4531-9801-47ba810f9086\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-86f4fff555-k5gn5"
Apr 24 21:35:17.669502 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:35:17.669350 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/320d920b-8b88-4531-9801-47ba810f9086-model-cache\") pod \"router-with-refs-test-kserve-86f4fff555-k5gn5\" (UID: \"320d920b-8b88-4531-9801-47ba810f9086\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-86f4fff555-k5gn5"
Apr 24 21:35:17.669502 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:35:17.669370 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/320d920b-8b88-4531-9801-47ba810f9086-kserve-provision-location\") pod \"router-with-refs-test-kserve-86f4fff555-k5gn5\" (UID: \"320d920b-8b88-4531-9801-47ba810f9086\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-86f4fff555-k5gn5"
Apr 24 21:35:17.669502 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:35:17.669389 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/320d920b-8b88-4531-9801-47ba810f9086-home\") pod \"router-with-refs-test-kserve-86f4fff555-k5gn5\" (UID: \"320d920b-8b88-4531-9801-47ba810f9086\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-86f4fff555-k5gn5"
Apr 24 21:35:17.669502 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:35:17.669429 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/320d920b-8b88-4531-9801-47ba810f9086-tmp-dir\") pod \"router-with-refs-test-kserve-86f4fff555-k5gn5\" (UID: \"320d920b-8b88-4531-9801-47ba810f9086\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-86f4fff555-k5gn5"
Apr 24 21:35:17.669502 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:35:17.669490 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/320d920b-8b88-4531-9801-47ba810f9086-tls-certs\") pod \"router-with-refs-test-kserve-86f4fff555-k5gn5\" (UID: \"320d920b-8b88-4531-9801-47ba810f9086\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-86f4fff555-k5gn5"
Apr 24 21:35:17.669683 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:35:17.669526 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/320d920b-8b88-4531-9801-47ba810f9086-dshm\") pod \"router-with-refs-test-kserve-86f4fff555-k5gn5\" (UID: \"320d920b-8b88-4531-9801-47ba810f9086\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-86f4fff555-k5gn5"
Apr 24 21:35:17.770816 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:35:17.770777 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nkthm\" (UniqueName: \"kubernetes.io/projected/320d920b-8b88-4531-9801-47ba810f9086-kube-api-access-nkthm\") pod \"router-with-refs-test-kserve-86f4fff555-k5gn5\" (UID: \"320d920b-8b88-4531-9801-47ba810f9086\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-86f4fff555-k5gn5"
Apr 24 21:35:17.771217 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:35:17.770831 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/320d920b-8b88-4531-9801-47ba810f9086-model-cache\") pod \"router-with-refs-test-kserve-86f4fff555-k5gn5\" (UID: \"320d920b-8b88-4531-9801-47ba810f9086\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-86f4fff555-k5gn5"
Apr 24 21:35:17.771217 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:35:17.770857 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/320d920b-8b88-4531-9801-47ba810f9086-kserve-provision-location\") pod \"router-with-refs-test-kserve-86f4fff555-k5gn5\" (UID: \"320d920b-8b88-4531-9801-47ba810f9086\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-86f4fff555-k5gn5"
Apr 24 21:35:17.771217 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:35:17.770878 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/320d920b-8b88-4531-9801-47ba810f9086-home\") pod \"router-with-refs-test-kserve-86f4fff555-k5gn5\" (UID: \"320d920b-8b88-4531-9801-47ba810f9086\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-86f4fff555-k5gn5"
Apr 24 21:35:17.771217 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:35:17.770911 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/320d920b-8b88-4531-9801-47ba810f9086-tmp-dir\") pod \"router-with-refs-test-kserve-86f4fff555-k5gn5\" (UID: \"320d920b-8b88-4531-9801-47ba810f9086\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-86f4fff555-k5gn5"
Apr 24 21:35:17.771217 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:35:17.770990 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/320d920b-8b88-4531-9801-47ba810f9086-tls-certs\") pod \"router-with-refs-test-kserve-86f4fff555-k5gn5\" (UID: \"320d920b-8b88-4531-9801-47ba810f9086\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-86f4fff555-k5gn5"
Apr 24 21:35:17.771217 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:35:17.771024 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/320d920b-8b88-4531-9801-47ba810f9086-dshm\") pod \"router-with-refs-test-kserve-86f4fff555-k5gn5\" (UID: \"320d920b-8b88-4531-9801-47ba810f9086\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-86f4fff555-k5gn5"
Apr 24 21:35:17.771549 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:35:17.771311 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/320d920b-8b88-4531-9801-47ba810f9086-kserve-provision-location\") pod \"router-with-refs-test-kserve-86f4fff555-k5gn5\" (UID: \"320d920b-8b88-4531-9801-47ba810f9086\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-86f4fff555-k5gn5"
Apr 24 21:35:17.771549 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:35:17.771341 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/320d920b-8b88-4531-9801-47ba810f9086-home\") pod \"router-with-refs-test-kserve-86f4fff555-k5gn5\" (UID: \"320d920b-8b88-4531-9801-47ba810f9086\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-86f4fff555-k5gn5"
Apr 24 21:35:17.771549 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:35:17.771392 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/320d920b-8b88-4531-9801-47ba810f9086-model-cache\") pod \"router-with-refs-test-kserve-86f4fff555-k5gn5\" (UID: \"320d920b-8b88-4531-9801-47ba810f9086\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-86f4fff555-k5gn5"
Apr 24 21:35:17.771549 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:35:17.771452 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/320d920b-8b88-4531-9801-47ba810f9086-tmp-dir\") pod \"router-with-refs-test-kserve-86f4fff555-k5gn5\" (UID: \"320d920b-8b88-4531-9801-47ba810f9086\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-86f4fff555-k5gn5"
Apr 24 21:35:17.773285 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:35:17.773259 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/320d920b-8b88-4531-9801-47ba810f9086-dshm\") pod \"router-with-refs-test-kserve-86f4fff555-k5gn5\" (UID: \"320d920b-8b88-4531-9801-47ba810f9086\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-86f4fff555-k5gn5"
Apr 24 21:35:17.773525 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:35:17.773507 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/320d920b-8b88-4531-9801-47ba810f9086-tls-certs\") pod \"router-with-refs-test-kserve-86f4fff555-k5gn5\" (UID: \"320d920b-8b88-4531-9801-47ba810f9086\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-86f4fff555-k5gn5"
Apr 24 21:35:17.778144 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:35:17.778126 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkthm\" (UniqueName: \"kubernetes.io/projected/320d920b-8b88-4531-9801-47ba810f9086-kube-api-access-nkthm\") pod \"router-with-refs-test-kserve-86f4fff555-k5gn5\" (UID: \"320d920b-8b88-4531-9801-47ba810f9086\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-86f4fff555-k5gn5"
Apr 24 21:35:17.893354 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:35:17.893269 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-86f4fff555-k5gn5"
Apr 24 21:35:18.024825 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:35:18.024792 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-86f4fff555-k5gn5"]
Apr 24 21:35:18.026656 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:35:18.026628 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod320d920b_8b88_4531_9801_47ba810f9086.slice/crio-82b7cbca965473f798fb5adf8f27656229a14e866d3b60af44065f05ef67fc5f WatchSource:0}: Error finding container 82b7cbca965473f798fb5adf8f27656229a14e866d3b60af44065f05ef67fc5f: Status 404 returned error can't find the container with id 82b7cbca965473f798fb5adf8f27656229a14e866d3b60af44065f05ef67fc5f
Apr 24 21:35:18.263546 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:35:18.263510 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-86f4fff555-k5gn5" event={"ID":"320d920b-8b88-4531-9801-47ba810f9086","Type":"ContainerStarted","Data":"a036f0e2d9b069997b73d973617d80fa4ebed4fe998c1f5f1fab610788071f1a"}
Apr 24 21:35:18.263546 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:35:18.263550 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-86f4fff555-k5gn5" event={"ID":"320d920b-8b88-4531-9801-47ba810f9086","Type":"ContainerStarted","Data":"82b7cbca965473f798fb5adf8f27656229a14e866d3b60af44065f05ef67fc5f"}
Apr 24 21:35:22.280520 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:35:22.280477 2560 generic.go:358] "Generic (PLEG): container finished" podID="320d920b-8b88-4531-9801-47ba810f9086" containerID="a036f0e2d9b069997b73d973617d80fa4ebed4fe998c1f5f1fab610788071f1a" exitCode=0
Apr 24 21:35:22.280859 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:35:22.280551 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-86f4fff555-k5gn5" event={"ID":"320d920b-8b88-4531-9801-47ba810f9086","Type":"ContainerDied","Data":"a036f0e2d9b069997b73d973617d80fa4ebed4fe998c1f5f1fab610788071f1a"}
Apr 24 21:35:23.289213 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:35:23.289178 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-86f4fff555-k5gn5" event={"ID":"320d920b-8b88-4531-9801-47ba810f9086","Type":"ContainerStarted","Data":"35b9851c13e54017e7414ae1bee8cca1dcc1470ff77afde1d81ec9fdede904e9"}
Apr 24 21:35:23.310156 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:35:23.310111 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-86f4fff555-k5gn5" podStartSLOduration=6.310097552 podStartE2EDuration="6.310097552s" podCreationTimestamp="2026-04-24 21:35:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:35:23.309120523 +0000 UTC m=+1111.367908204" watchObservedRunningTime="2026-04-24 21:35:23.310097552 +0000 UTC m=+1111.368885224"
Apr 24 21:35:24.443773 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:35:24.443730 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" podUID="34cfb37c-464d-48d7-bb12-fb57061b8e43" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")"
Apr 24 21:35:27.893835 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:35:27.893797 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-86f4fff555-k5gn5"
Apr 24 21:35:27.894227 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:35:27.893850 2560 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-86f4fff555-k5gn5"
Apr 24 21:35:27.895572 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:35:27.895546 2560 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-86f4fff555-k5gn5" podUID="320d920b-8b88-4531-9801-47ba810f9086" containerName="main" probeResult="failure" output="Get \"https://10.134.0.40:8000/health\": dial tcp 10.134.0.40:8000: connect: connection refused"
Apr 24 21:35:34.444091 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:35:34.444048 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" podUID="34cfb37c-464d-48d7-bb12-fb57061b8e43" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")"
Apr 24 21:35:37.893995 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:35:37.893955 2560 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-86f4fff555-k5gn5" podUID="320d920b-8b88-4531-9801-47ba810f9086" containerName="main" probeResult="failure" output="Get \"https://10.134.0.40:8000/health\": dial tcp 10.134.0.40:8000: connect: connection refused"
Apr 24 21:35:42.861721 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:35:42.861695 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-test-kserve-6944cc8dd7-qjrjw_5b5c6a98-e903-4858-9420-0cf3a0ca6a4e/main/0.log"
Apr 24 21:35:42.862095 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:35:42.862076 2560 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6944cc8dd7-qjrjw" Apr 24 21:35:42.988845 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:35:42.988805 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cpdpq\" (UniqueName: \"kubernetes.io/projected/5b5c6a98-e903-4858-9420-0cf3a0ca6a4e-kube-api-access-cpdpq\") pod \"5b5c6a98-e903-4858-9420-0cf3a0ca6a4e\" (UID: \"5b5c6a98-e903-4858-9420-0cf3a0ca6a4e\") " Apr 24 21:35:42.989017 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:35:42.988878 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5b5c6a98-e903-4858-9420-0cf3a0ca6a4e-model-cache\") pod \"5b5c6a98-e903-4858-9420-0cf3a0ca6a4e\" (UID: \"5b5c6a98-e903-4858-9420-0cf3a0ca6a4e\") " Apr 24 21:35:42.989017 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:35:42.988914 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5b5c6a98-e903-4858-9420-0cf3a0ca6a4e-tls-certs\") pod \"5b5c6a98-e903-4858-9420-0cf3a0ca6a4e\" (UID: \"5b5c6a98-e903-4858-9420-0cf3a0ca6a4e\") " Apr 24 21:35:42.989017 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:35:42.988975 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5b5c6a98-e903-4858-9420-0cf3a0ca6a4e-tmp-dir\") pod \"5b5c6a98-e903-4858-9420-0cf3a0ca6a4e\" (UID: \"5b5c6a98-e903-4858-9420-0cf3a0ca6a4e\") " Apr 24 21:35:42.989017 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:35:42.989006 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5b5c6a98-e903-4858-9420-0cf3a0ca6a4e-dshm\") pod \"5b5c6a98-e903-4858-9420-0cf3a0ca6a4e\" (UID: \"5b5c6a98-e903-4858-9420-0cf3a0ca6a4e\") " Apr 24 21:35:42.989230 ip-10-0-128-142 kubenswrapper[2560]: I0424 
21:35:42.989031 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5b5c6a98-e903-4858-9420-0cf3a0ca6a4e-home\") pod \"5b5c6a98-e903-4858-9420-0cf3a0ca6a4e\" (UID: \"5b5c6a98-e903-4858-9420-0cf3a0ca6a4e\") " Apr 24 21:35:42.989230 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:35:42.989111 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5b5c6a98-e903-4858-9420-0cf3a0ca6a4e-kserve-provision-location\") pod \"5b5c6a98-e903-4858-9420-0cf3a0ca6a4e\" (UID: \"5b5c6a98-e903-4858-9420-0cf3a0ca6a4e\") " Apr 24 21:35:42.989693 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:35:42.989647 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b5c6a98-e903-4858-9420-0cf3a0ca6a4e-home" (OuterVolumeSpecName: "home") pod "5b5c6a98-e903-4858-9420-0cf3a0ca6a4e" (UID: "5b5c6a98-e903-4858-9420-0cf3a0ca6a4e"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:35:42.991104 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:35:42.989194 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b5c6a98-e903-4858-9420-0cf3a0ca6a4e-model-cache" (OuterVolumeSpecName: "model-cache") pod "5b5c6a98-e903-4858-9420-0cf3a0ca6a4e" (UID: "5b5c6a98-e903-4858-9420-0cf3a0ca6a4e"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:35:42.991479 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:35:42.991453 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b5c6a98-e903-4858-9420-0cf3a0ca6a4e-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "5b5c6a98-e903-4858-9420-0cf3a0ca6a4e" (UID: "5b5c6a98-e903-4858-9420-0cf3a0ca6a4e"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:35:42.991604 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:35:42.991563 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b5c6a98-e903-4858-9420-0cf3a0ca6a4e-dshm" (OuterVolumeSpecName: "dshm") pod "5b5c6a98-e903-4858-9420-0cf3a0ca6a4e" (UID: "5b5c6a98-e903-4858-9420-0cf3a0ca6a4e"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:35:42.991665 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:35:42.991642 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b5c6a98-e903-4858-9420-0cf3a0ca6a4e-kube-api-access-cpdpq" (OuterVolumeSpecName: "kube-api-access-cpdpq") pod "5b5c6a98-e903-4858-9420-0cf3a0ca6a4e" (UID: "5b5c6a98-e903-4858-9420-0cf3a0ca6a4e"). InnerVolumeSpecName "kube-api-access-cpdpq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:35:43.002971 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:35:43.002946 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b5c6a98-e903-4858-9420-0cf3a0ca6a4e-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "5b5c6a98-e903-4858-9420-0cf3a0ca6a4e" (UID: "5b5c6a98-e903-4858-9420-0cf3a0ca6a4e"). InnerVolumeSpecName "tmp-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:35:43.044351 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:35:43.044320 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b5c6a98-e903-4858-9420-0cf3a0ca6a4e-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "5b5c6a98-e903-4858-9420-0cf3a0ca6a4e" (UID: "5b5c6a98-e903-4858-9420-0cf3a0ca6a4e"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:35:43.089692 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:35:43.089668 2560 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5b5c6a98-e903-4858-9420-0cf3a0ca6a4e-tmp-dir\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\"" Apr 24 21:35:43.089793 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:35:43.089700 2560 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5b5c6a98-e903-4858-9420-0cf3a0ca6a4e-dshm\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\"" Apr 24 21:35:43.089793 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:35:43.089710 2560 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5b5c6a98-e903-4858-9420-0cf3a0ca6a4e-home\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\"" Apr 24 21:35:43.089793 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:35:43.089720 2560 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5b5c6a98-e903-4858-9420-0cf3a0ca6a4e-kserve-provision-location\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\"" Apr 24 21:35:43.089793 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:35:43.089730 2560 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cpdpq\" (UniqueName: \"kubernetes.io/projected/5b5c6a98-e903-4858-9420-0cf3a0ca6a4e-kube-api-access-cpdpq\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\"" Apr 24 21:35:43.089793 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:35:43.089739 2560 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5b5c6a98-e903-4858-9420-0cf3a0ca6a4e-model-cache\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\"" Apr 24 21:35:43.089793 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:35:43.089747 2560 
reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5b5c6a98-e903-4858-9420-0cf3a0ca6a4e-tls-certs\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\"" Apr 24 21:35:43.362308 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:35:43.362284 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-test-kserve-6944cc8dd7-qjrjw_5b5c6a98-e903-4858-9420-0cf3a0ca6a4e/main/0.log" Apr 24 21:35:43.362662 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:35:43.362639 2560 generic.go:358] "Generic (PLEG): container finished" podID="5b5c6a98-e903-4858-9420-0cf3a0ca6a4e" containerID="9723a9974595b941753f3ce5e3b3007ede8a0e1cf6a4a00cd3ecc4bdf00b865d" exitCode=137 Apr 24 21:35:43.362735 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:35:43.362720 2560 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6944cc8dd7-qjrjw" Apr 24 21:35:43.362777 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:35:43.362741 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6944cc8dd7-qjrjw" event={"ID":"5b5c6a98-e903-4858-9420-0cf3a0ca6a4e","Type":"ContainerDied","Data":"9723a9974595b941753f3ce5e3b3007ede8a0e1cf6a4a00cd3ecc4bdf00b865d"} Apr 24 21:35:43.362812 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:35:43.362794 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6944cc8dd7-qjrjw" event={"ID":"5b5c6a98-e903-4858-9420-0cf3a0ca6a4e","Type":"ContainerDied","Data":"047324f5c3e5c639ef27788f7558c975d75962b6b38d6e11f4f6b82a5e9a41db"} Apr 24 21:35:43.362845 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:35:43.362817 2560 scope.go:117] "RemoveContainer" containerID="9723a9974595b941753f3ce5e3b3007ede8a0e1cf6a4a00cd3ecc4bdf00b865d" Apr 24 21:35:43.372470 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:35:43.372448 2560 
scope.go:117] "RemoveContainer" containerID="c21bae1ed57b86bdf22f0572e9dedf0be21d6888b8cfdc9976360cad25080e6d" Apr 24 21:35:43.383634 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:35:43.383617 2560 scope.go:117] "RemoveContainer" containerID="9723a9974595b941753f3ce5e3b3007ede8a0e1cf6a4a00cd3ecc4bdf00b865d" Apr 24 21:35:43.383870 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:35:43.383853 2560 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9723a9974595b941753f3ce5e3b3007ede8a0e1cf6a4a00cd3ecc4bdf00b865d\": container with ID starting with 9723a9974595b941753f3ce5e3b3007ede8a0e1cf6a4a00cd3ecc4bdf00b865d not found: ID does not exist" containerID="9723a9974595b941753f3ce5e3b3007ede8a0e1cf6a4a00cd3ecc4bdf00b865d" Apr 24 21:35:43.383934 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:35:43.383878 2560 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9723a9974595b941753f3ce5e3b3007ede8a0e1cf6a4a00cd3ecc4bdf00b865d"} err="failed to get container status \"9723a9974595b941753f3ce5e3b3007ede8a0e1cf6a4a00cd3ecc4bdf00b865d\": rpc error: code = NotFound desc = could not find container \"9723a9974595b941753f3ce5e3b3007ede8a0e1cf6a4a00cd3ecc4bdf00b865d\": container with ID starting with 9723a9974595b941753f3ce5e3b3007ede8a0e1cf6a4a00cd3ecc4bdf00b865d not found: ID does not exist" Apr 24 21:35:43.383934 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:35:43.383895 2560 scope.go:117] "RemoveContainer" containerID="c21bae1ed57b86bdf22f0572e9dedf0be21d6888b8cfdc9976360cad25080e6d" Apr 24 21:35:43.384126 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:35:43.384108 2560 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c21bae1ed57b86bdf22f0572e9dedf0be21d6888b8cfdc9976360cad25080e6d\": container with ID starting with c21bae1ed57b86bdf22f0572e9dedf0be21d6888b8cfdc9976360cad25080e6d not found: ID does 
not exist" containerID="c21bae1ed57b86bdf22f0572e9dedf0be21d6888b8cfdc9976360cad25080e6d" Apr 24 21:35:43.384170 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:35:43.384131 2560 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c21bae1ed57b86bdf22f0572e9dedf0be21d6888b8cfdc9976360cad25080e6d"} err="failed to get container status \"c21bae1ed57b86bdf22f0572e9dedf0be21d6888b8cfdc9976360cad25080e6d\": rpc error: code = NotFound desc = could not find container \"c21bae1ed57b86bdf22f0572e9dedf0be21d6888b8cfdc9976360cad25080e6d\": container with ID starting with c21bae1ed57b86bdf22f0572e9dedf0be21d6888b8cfdc9976360cad25080e6d not found: ID does not exist" Apr 24 21:35:43.401019 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:35:43.400994 2560 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-6944cc8dd7-qjrjw"] Apr 24 21:35:43.407847 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:35:43.407828 2560 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-6944cc8dd7-qjrjw"] Apr 24 21:35:44.443241 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:35:44.443194 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" podUID="34cfb37c-464d-48d7-bb12-fb57061b8e43" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 24 21:35:44.445399 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:35:44.445373 2560 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b5c6a98-e903-4858-9420-0cf3a0ca6a4e" path="/var/lib/kubelet/pods/5b5c6a98-e903-4858-9420-0cf3a0ca6a4e/volumes" Apr 24 21:35:47.894502 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:35:47.894463 2560 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-86f4fff555-k5gn5" 
podUID="320d920b-8b88-4531-9801-47ba810f9086" containerName="main" probeResult="failure" output="Get \"https://10.134.0.40:8000/health\": dial tcp 10.134.0.40:8000: connect: connection refused" Apr 24 21:35:54.444167 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:35:54.444123 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" podUID="34cfb37c-464d-48d7-bb12-fb57061b8e43" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 24 21:35:57.894100 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:35:57.894057 2560 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-86f4fff555-k5gn5" podUID="320d920b-8b88-4531-9801-47ba810f9086" containerName="main" probeResult="failure" output="Get \"https://10.134.0.40:8000/health\": dial tcp 10.134.0.40:8000: connect: connection refused" Apr 24 21:35:58.442750 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:35:58.442710 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" podUID="34cfb37c-464d-48d7-bb12-fb57061b8e43" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 24 21:36:07.893849 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:36:07.893806 2560 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-86f4fff555-k5gn5" podUID="320d920b-8b88-4531-9801-47ba810f9086" containerName="main" probeResult="failure" output="Get \"https://10.134.0.40:8000/health\": dial tcp 10.134.0.40:8000: connect: connection refused" Apr 24 21:36:08.443187 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:36:08.443141 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" 
podUID="34cfb37c-464d-48d7-bb12-fb57061b8e43" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 24 21:36:17.894739 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:36:17.894690 2560 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-86f4fff555-k5gn5" podUID="320d920b-8b88-4531-9801-47ba810f9086" containerName="main" probeResult="failure" output="Get \"https://10.134.0.40:8000/health\": dial tcp 10.134.0.40:8000: connect: connection refused" Apr 24 21:36:18.443257 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:36:18.443217 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" podUID="34cfb37c-464d-48d7-bb12-fb57061b8e43" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 24 21:36:27.894550 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:36:27.894498 2560 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-86f4fff555-k5gn5" podUID="320d920b-8b88-4531-9801-47ba810f9086" containerName="main" probeResult="failure" output="Get \"https://10.134.0.40:8000/health\": dial tcp 10.134.0.40:8000: connect: connection refused" Apr 24 21:36:28.442948 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:36:28.442882 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" podUID="34cfb37c-464d-48d7-bb12-fb57061b8e43" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 24 21:36:37.894381 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:36:37.894335 2560 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-86f4fff555-k5gn5" podUID="320d920b-8b88-4531-9801-47ba810f9086" containerName="main" 
probeResult="failure" output="Get \"https://10.134.0.40:8000/health\": dial tcp 10.134.0.40:8000: connect: connection refused" Apr 24 21:36:38.443718 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:36:38.443681 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" podUID="34cfb37c-464d-48d7-bb12-fb57061b8e43" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 24 21:36:47.894033 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:36:47.893994 2560 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-86f4fff555-k5gn5" podUID="320d920b-8b88-4531-9801-47ba810f9086" containerName="main" probeResult="failure" output="Get \"https://10.134.0.40:8000/health\": dial tcp 10.134.0.40:8000: connect: connection refused" Apr 24 21:36:48.443676 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:36:48.443638 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" podUID="34cfb37c-464d-48d7-bb12-fb57061b8e43" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 24 21:36:52.417247 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:36:52.417213 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6rbsp_260be1ca-939b-4b9e-9f93-078d2506aef0/ovn-acl-logging/0.log" Apr 24 21:36:52.418311 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:36:52.418290 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-128-142.ec2.internal_784ad9f5e87f1e75291c25fc06106e5e/kube-rbac-proxy-crio/2.log" Apr 24 21:36:52.419613 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:36:52.419596 2560 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6rbsp_260be1ca-939b-4b9e-9f93-078d2506aef0/ovn-acl-logging/0.log" Apr 24 21:36:52.420431 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:36:52.420416 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-128-142.ec2.internal_784ad9f5e87f1e75291c25fc06106e5e/kube-rbac-proxy-crio/2.log" Apr 24 21:36:57.903582 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:36:57.903547 2560 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-86f4fff555-k5gn5" Apr 24 21:36:57.911244 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:36:57.911222 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-86f4fff555-k5gn5" Apr 24 21:36:58.442633 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:36:58.442589 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" podUID="34cfb37c-464d-48d7-bb12-fb57061b8e43" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 24 21:37:03.408383 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:37:03.408340 2560 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-86f4fff555-k5gn5"] Apr 24 21:37:03.408838 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:37:03.408624 2560 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-86f4fff555-k5gn5" podUID="320d920b-8b88-4531-9801-47ba810f9086" containerName="main" containerID="cri-o://35b9851c13e54017e7414ae1bee8cca1dcc1470ff77afde1d81ec9fdede904e9" gracePeriod=30 Apr 24 21:37:05.442132 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:37:05.442095 2560 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" podUID="34cfb37c-464d-48d7-bb12-fb57061b8e43" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 24 21:37:15.442384 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:37:15.442346 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" podUID="34cfb37c-464d-48d7-bb12-fb57061b8e43" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 24 21:37:19.628094 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:37:19.628054 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-66c6c4996-4s95w"] Apr 24 21:37:19.628681 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:37:19.628473 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5b5c6a98-e903-4858-9420-0cf3a0ca6a4e" containerName="storage-initializer" Apr 24 21:37:19.628681 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:37:19.628490 2560 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b5c6a98-e903-4858-9420-0cf3a0ca6a4e" containerName="storage-initializer" Apr 24 21:37:19.628681 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:37:19.628520 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5b5c6a98-e903-4858-9420-0cf3a0ca6a4e" containerName="main" Apr 24 21:37:19.628681 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:37:19.628529 2560 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b5c6a98-e903-4858-9420-0cf3a0ca6a4e" containerName="main" Apr 24 21:37:19.628681 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:37:19.628614 2560 memory_manager.go:356] "RemoveStaleState removing state" podUID="5b5c6a98-e903-4858-9420-0cf3a0ca6a4e" containerName="main" Apr 24 21:37:19.630689 ip-10-0-128-142 kubenswrapper[2560]: I0424 
21:37:19.630667 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-66c6c4996-4s95w" Apr 24 21:37:19.633313 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:37:19.633290 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv8f1a6f044e8c7a4d31a250e0c4861caf-kserve-self-signed-certs\"" Apr 24 21:37:19.633449 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:37:19.633363 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-dockercfg-4q5kd\"" Apr 24 21:37:19.648971 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:37:19.648946 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-66c6c4996-4s95w"] Apr 24 21:37:19.652401 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:37:19.652376 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8c5hmz7"] Apr 24 21:37:19.654582 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:37:19.654565 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8c5hmz7" Apr 24 21:37:19.668188 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:37:19.668168 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8c5hmz7"] Apr 24 21:37:19.739639 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:37:19.739609 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/63c53ce2-2559-4384-9433-c0292f470550-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-66c6c4996-4s95w\" (UID: \"63c53ce2-2559-4384-9433-c0292f470550\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-66c6c4996-4s95w" Apr 24 21:37:19.739796 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:37:19.739642 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6cda0b52-5413-49e9-a85e-3f7ab69677c0-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8c5hmz7\" (UID: \"6cda0b52-5413-49e9-a85e-3f7ab69677c0\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8c5hmz7" Apr 24 21:37:19.739796 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:37:19.739663 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6cda0b52-5413-49e9-a85e-3f7ab69677c0-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8c5hmz7\" (UID: \"6cda0b52-5413-49e9-a85e-3f7ab69677c0\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8c5hmz7" Apr 24 21:37:19.739796 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:37:19.739683 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"home\" (UniqueName: \"kubernetes.io/empty-dir/63c53ce2-2559-4384-9433-c0292f470550-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-66c6c4996-4s95w\" (UID: \"63c53ce2-2559-4384-9433-c0292f470550\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-66c6c4996-4s95w" Apr 24 21:37:19.739796 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:37:19.739753 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/63c53ce2-2559-4384-9433-c0292f470550-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-66c6c4996-4s95w\" (UID: \"63c53ce2-2559-4384-9433-c0292f470550\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-66c6c4996-4s95w" Apr 24 21:37:19.739796 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:37:19.739793 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6cda0b52-5413-49e9-a85e-3f7ab69677c0-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8c5hmz7\" (UID: \"6cda0b52-5413-49e9-a85e-3f7ab69677c0\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8c5hmz7" Apr 24 21:37:19.740072 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:37:19.739812 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6cda0b52-5413-49e9-a85e-3f7ab69677c0-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8c5hmz7\" (UID: \"6cda0b52-5413-49e9-a85e-3f7ab69677c0\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8c5hmz7" Apr 24 21:37:19.740072 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:37:19.739863 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/63c53ce2-2559-4384-9433-c0292f470550-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-66c6c4996-4s95w\" (UID: \"63c53ce2-2559-4384-9433-c0292f470550\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-66c6c4996-4s95w" Apr 24 21:37:19.740072 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:37:19.739888 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmvt2\" (UniqueName: \"kubernetes.io/projected/63c53ce2-2559-4384-9433-c0292f470550-kube-api-access-fmvt2\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-66c6c4996-4s95w\" (UID: \"63c53ce2-2559-4384-9433-c0292f470550\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-66c6c4996-4s95w" Apr 24 21:37:19.740072 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:37:19.740001 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6cda0b52-5413-49e9-a85e-3f7ab69677c0-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8c5hmz7\" (UID: \"6cda0b52-5413-49e9-a85e-3f7ab69677c0\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8c5hmz7" Apr 24 21:37:19.740072 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:37:19.740053 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trl74\" (UniqueName: \"kubernetes.io/projected/6cda0b52-5413-49e9-a85e-3f7ab69677c0-kube-api-access-trl74\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8c5hmz7\" (UID: \"6cda0b52-5413-49e9-a85e-3f7ab69677c0\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8c5hmz7" Apr 24 21:37:19.740292 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:37:19.740082 2560 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/63c53ce2-2559-4384-9433-c0292f470550-tmp-dir\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-66c6c4996-4s95w\" (UID: \"63c53ce2-2559-4384-9433-c0292f470550\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-66c6c4996-4s95w" Apr 24 21:37:19.740292 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:37:19.740107 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/63c53ce2-2559-4384-9433-c0292f470550-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-66c6c4996-4s95w\" (UID: \"63c53ce2-2559-4384-9433-c0292f470550\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-66c6c4996-4s95w" Apr 24 21:37:19.740292 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:37:19.740152 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6cda0b52-5413-49e9-a85e-3f7ab69677c0-tmp-dir\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8c5hmz7\" (UID: \"6cda0b52-5413-49e9-a85e-3f7ab69677c0\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8c5hmz7" Apr 24 21:37:19.841582 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:37:19.841491 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/63c53ce2-2559-4384-9433-c0292f470550-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-66c6c4996-4s95w\" (UID: \"63c53ce2-2559-4384-9433-c0292f470550\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-66c6c4996-4s95w" Apr 24 21:37:19.841755 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:37:19.841620 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6cda0b52-5413-49e9-a85e-3f7ab69677c0-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8c5hmz7\" (UID: \"6cda0b52-5413-49e9-a85e-3f7ab69677c0\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8c5hmz7" Apr 24 21:37:19.841755 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:37:19.841650 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6cda0b52-5413-49e9-a85e-3f7ab69677c0-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8c5hmz7\" (UID: \"6cda0b52-5413-49e9-a85e-3f7ab69677c0\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8c5hmz7" Apr 24 21:37:19.841755 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:37:19.841695 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/63c53ce2-2559-4384-9433-c0292f470550-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-66c6c4996-4s95w\" (UID: \"63c53ce2-2559-4384-9433-c0292f470550\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-66c6c4996-4s95w" Apr 24 21:37:19.841755 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:37:19.841735 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fmvt2\" (UniqueName: \"kubernetes.io/projected/63c53ce2-2559-4384-9433-c0292f470550-kube-api-access-fmvt2\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-66c6c4996-4s95w\" (UID: \"63c53ce2-2559-4384-9433-c0292f470550\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-66c6c4996-4s95w" Apr 24 21:37:19.842006 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:37:19.841805 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6cda0b52-5413-49e9-a85e-3f7ab69677c0-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8c5hmz7\" (UID: \"6cda0b52-5413-49e9-a85e-3f7ab69677c0\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8c5hmz7" Apr 24 21:37:19.842006 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:37:19.841842 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-trl74\" (UniqueName: \"kubernetes.io/projected/6cda0b52-5413-49e9-a85e-3f7ab69677c0-kube-api-access-trl74\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8c5hmz7\" (UID: \"6cda0b52-5413-49e9-a85e-3f7ab69677c0\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8c5hmz7" Apr 24 21:37:19.842006 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:37:19.841874 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/63c53ce2-2559-4384-9433-c0292f470550-tmp-dir\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-66c6c4996-4s95w\" (UID: \"63c53ce2-2559-4384-9433-c0292f470550\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-66c6c4996-4s95w" Apr 24 21:37:19.842006 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:37:19.841900 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/63c53ce2-2559-4384-9433-c0292f470550-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-66c6c4996-4s95w\" (UID: \"63c53ce2-2559-4384-9433-c0292f470550\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-66c6c4996-4s95w" Apr 24 21:37:19.842006 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:37:19.841959 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: 
\"kubernetes.io/empty-dir/6cda0b52-5413-49e9-a85e-3f7ab69677c0-tmp-dir\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8c5hmz7\" (UID: \"6cda0b52-5413-49e9-a85e-3f7ab69677c0\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8c5hmz7" Apr 24 21:37:19.842247 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:37:19.842005 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/63c53ce2-2559-4384-9433-c0292f470550-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-66c6c4996-4s95w\" (UID: \"63c53ce2-2559-4384-9433-c0292f470550\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-66c6c4996-4s95w" Apr 24 21:37:19.842247 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:37:19.842035 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6cda0b52-5413-49e9-a85e-3f7ab69677c0-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8c5hmz7\" (UID: \"6cda0b52-5413-49e9-a85e-3f7ab69677c0\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8c5hmz7" Apr 24 21:37:19.842247 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:37:19.842062 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6cda0b52-5413-49e9-a85e-3f7ab69677c0-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8c5hmz7\" (UID: \"6cda0b52-5413-49e9-a85e-3f7ab69677c0\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8c5hmz7" Apr 24 21:37:19.842247 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:37:19.842099 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/63c53ce2-2559-4384-9433-c0292f470550-home\") pod 
\"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-66c6c4996-4s95w\" (UID: \"63c53ce2-2559-4384-9433-c0292f470550\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-66c6c4996-4s95w" Apr 24 21:37:19.842247 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:37:19.842141 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6cda0b52-5413-49e9-a85e-3f7ab69677c0-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8c5hmz7\" (UID: \"6cda0b52-5413-49e9-a85e-3f7ab69677c0\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8c5hmz7" Apr 24 21:37:19.842247 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:37:19.842143 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/63c53ce2-2559-4384-9433-c0292f470550-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-66c6c4996-4s95w\" (UID: \"63c53ce2-2559-4384-9433-c0292f470550\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-66c6c4996-4s95w" Apr 24 21:37:19.842247 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:37:19.842172 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6cda0b52-5413-49e9-a85e-3f7ab69677c0-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8c5hmz7\" (UID: \"6cda0b52-5413-49e9-a85e-3f7ab69677c0\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8c5hmz7" Apr 24 21:37:19.842580 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:37:19.842370 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6cda0b52-5413-49e9-a85e-3f7ab69677c0-tmp-dir\") pod 
\"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8c5hmz7\" (UID: \"6cda0b52-5413-49e9-a85e-3f7ab69677c0\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8c5hmz7" Apr 24 21:37:19.842580 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:37:19.842428 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6cda0b52-5413-49e9-a85e-3f7ab69677c0-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8c5hmz7\" (UID: \"6cda0b52-5413-49e9-a85e-3f7ab69677c0\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8c5hmz7" Apr 24 21:37:19.842580 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:37:19.842436 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/63c53ce2-2559-4384-9433-c0292f470550-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-66c6c4996-4s95w\" (UID: \"63c53ce2-2559-4384-9433-c0292f470550\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-66c6c4996-4s95w" Apr 24 21:37:19.842580 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:37:19.842483 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/63c53ce2-2559-4384-9433-c0292f470550-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-66c6c4996-4s95w\" (UID: \"63c53ce2-2559-4384-9433-c0292f470550\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-66c6c4996-4s95w" Apr 24 21:37:19.842580 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:37:19.842491 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/63c53ce2-2559-4384-9433-c0292f470550-tmp-dir\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-66c6c4996-4s95w\" (UID: \"63c53ce2-2559-4384-9433-c0292f470550\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-66c6c4996-4s95w" Apr 24 21:37:19.844274 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:37:19.844251 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6cda0b52-5413-49e9-a85e-3f7ab69677c0-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8c5hmz7\" (UID: \"6cda0b52-5413-49e9-a85e-3f7ab69677c0\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8c5hmz7" Apr 24 21:37:19.844382 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:37:19.844283 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/63c53ce2-2559-4384-9433-c0292f470550-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-66c6c4996-4s95w\" (UID: \"63c53ce2-2559-4384-9433-c0292f470550\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-66c6c4996-4s95w" Apr 24 21:37:19.844674 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:37:19.844657 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/63c53ce2-2559-4384-9433-c0292f470550-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-66c6c4996-4s95w\" (UID: \"63c53ce2-2559-4384-9433-c0292f470550\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-66c6c4996-4s95w" Apr 24 21:37:19.844811 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:37:19.844717 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6cda0b52-5413-49e9-a85e-3f7ab69677c0-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8c5hmz7\" (UID: \"6cda0b52-5413-49e9-a85e-3f7ab69677c0\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8c5hmz7" Apr 24 21:37:19.851035 ip-10-0-128-142 kubenswrapper[2560]: 
I0424 21:37:19.851008 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-trl74\" (UniqueName: \"kubernetes.io/projected/6cda0b52-5413-49e9-a85e-3f7ab69677c0-kube-api-access-trl74\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8c5hmz7\" (UID: \"6cda0b52-5413-49e9-a85e-3f7ab69677c0\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8c5hmz7" Apr 24 21:37:19.851835 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:37:19.851808 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmvt2\" (UniqueName: \"kubernetes.io/projected/63c53ce2-2559-4384-9433-c0292f470550-kube-api-access-fmvt2\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-66c6c4996-4s95w\" (UID: \"63c53ce2-2559-4384-9433-c0292f470550\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-66c6c4996-4s95w" Apr 24 21:37:19.941412 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:37:19.941330 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-66c6c4996-4s95w" Apr 24 21:37:19.967239 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:37:19.967207 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8c5hmz7" Apr 24 21:37:20.084310 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:37:20.084266 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-66c6c4996-4s95w"] Apr 24 21:37:20.086804 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:37:20.086774 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63c53ce2_2559_4384_9433_c0292f470550.slice/crio-7402a666dea25d8cb1e0a0db541b46e6b5eac419dd55099cd6b42136ce278410 WatchSource:0}: Error finding container 7402a666dea25d8cb1e0a0db541b46e6b5eac419dd55099cd6b42136ce278410: Status 404 returned error can't find the container with id 7402a666dea25d8cb1e0a0db541b46e6b5eac419dd55099cd6b42136ce278410 Apr 24 21:37:20.102263 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:37:20.102233 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6cda0b52_5413_49e9_a85e_3f7ab69677c0.slice/crio-95cf6bf47f1d81de0ba91b07734f93d7a28ef68818452b4daf021c3201b1e74a WatchSource:0}: Error finding container 95cf6bf47f1d81de0ba91b07734f93d7a28ef68818452b4daf021c3201b1e74a: Status 404 returned error can't find the container with id 95cf6bf47f1d81de0ba91b07734f93d7a28ef68818452b4daf021c3201b1e74a Apr 24 21:37:20.102389 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:37:20.102307 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8c5hmz7"] Apr 24 21:37:20.722941 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:37:20.722866 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-66c6c4996-4s95w" 
event={"ID":"63c53ce2-2559-4384-9433-c0292f470550","Type":"ContainerStarted","Data":"7402a666dea25d8cb1e0a0db541b46e6b5eac419dd55099cd6b42136ce278410"} Apr 24 21:37:20.724506 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:37:20.724482 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8c5hmz7" event={"ID":"6cda0b52-5413-49e9-a85e-3f7ab69677c0","Type":"ContainerStarted","Data":"fa560f45cba553047824610a90e8111e17a9a0c4a236a5efa479d7ddac603802"} Apr 24 21:37:20.724640 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:37:20.724514 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8c5hmz7" event={"ID":"6cda0b52-5413-49e9-a85e-3f7ab69677c0","Type":"ContainerStarted","Data":"95cf6bf47f1d81de0ba91b07734f93d7a28ef68818452b4daf021c3201b1e74a"} Apr 24 21:37:21.729809 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:37:21.729775 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-66c6c4996-4s95w" event={"ID":"63c53ce2-2559-4384-9433-c0292f470550","Type":"ContainerStarted","Data":"d93110541c6a9061c1375158dff10a0ec3ec852824c4b023cc7758b7d0f39ec2"} Apr 24 21:37:21.730244 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:37:21.729882 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-66c6c4996-4s95w" Apr 24 21:37:22.734681 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:37:22.734646 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-66c6c4996-4s95w" event={"ID":"63c53ce2-2559-4384-9433-c0292f470550","Type":"ContainerStarted","Data":"e6db3f8b936f1c1a779cc03862d4f356c50c064265ec1a783ed403e5e3cae7a6"} Apr 24 21:37:25.443265 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:37:25.443224 2560 
prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" podUID="34cfb37c-464d-48d7-bb12-fb57061b8e43" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 24 21:37:26.751987 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:37:26.751954 2560 generic.go:358] "Generic (PLEG): container finished" podID="63c53ce2-2559-4384-9433-c0292f470550" containerID="e6db3f8b936f1c1a779cc03862d4f356c50c064265ec1a783ed403e5e3cae7a6" exitCode=0 Apr 24 21:37:26.752358 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:37:26.752012 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-66c6c4996-4s95w" event={"ID":"63c53ce2-2559-4384-9433-c0292f470550","Type":"ContainerDied","Data":"e6db3f8b936f1c1a779cc03862d4f356c50c064265ec1a783ed403e5e3cae7a6"} Apr 24 21:37:27.757978 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:37:27.757944 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-66c6c4996-4s95w" event={"ID":"63c53ce2-2559-4384-9433-c0292f470550","Type":"ContainerStarted","Data":"459f2e76d7fd1b9c2fda234701cda8604839b1111fe78d3b7b9a56cba226495b"} Apr 24 21:37:27.795422 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:37:27.795374 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-66c6c4996-4s95w" podStartSLOduration=7.986159558 podStartE2EDuration="8.795360263s" podCreationTimestamp="2026-04-24 21:37:19 +0000 UTC" firstStartedPulling="2026-04-24 21:37:20.088489271 +0000 UTC m=+1228.147276922" lastFinishedPulling="2026-04-24 21:37:20.897689977 +0000 UTC m=+1228.956477627" observedRunningTime="2026-04-24 21:37:27.790640962 +0000 UTC m=+1235.849428634" watchObservedRunningTime="2026-04-24 21:37:27.795360263 +0000 UTC 
m=+1235.854147934" Apr 24 21:37:29.942126 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:37:29.942090 2560 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-66c6c4996-4s95w" Apr 24 21:37:29.942126 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:37:29.942132 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-66c6c4996-4s95w" Apr 24 21:37:29.943624 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:37:29.943587 2560 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-66c6c4996-4s95w" podUID="63c53ce2-2559-4384-9433-c0292f470550" containerName="main" probeResult="failure" output="Get \"https://10.134.0.41:8001/health\": dial tcp 10.134.0.41:8001: connect: connection refused" Apr 24 21:37:33.684521 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:37:33.684499 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-test-kserve-86f4fff555-k5gn5_320d920b-8b88-4531-9801-47ba810f9086/main/0.log" Apr 24 21:37:33.684955 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:37:33.684916 2560 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-86f4fff555-k5gn5" Apr 24 21:37:33.760155 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:37:33.760123 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/320d920b-8b88-4531-9801-47ba810f9086-dshm\") pod \"320d920b-8b88-4531-9801-47ba810f9086\" (UID: \"320d920b-8b88-4531-9801-47ba810f9086\") " Apr 24 21:37:33.760333 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:37:33.760171 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/320d920b-8b88-4531-9801-47ba810f9086-model-cache\") pod \"320d920b-8b88-4531-9801-47ba810f9086\" (UID: \"320d920b-8b88-4531-9801-47ba810f9086\") " Apr 24 21:37:33.760333 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:37:33.760202 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/320d920b-8b88-4531-9801-47ba810f9086-tls-certs\") pod \"320d920b-8b88-4531-9801-47ba810f9086\" (UID: \"320d920b-8b88-4531-9801-47ba810f9086\") " Apr 24 21:37:33.760333 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:37:33.760225 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nkthm\" (UniqueName: \"kubernetes.io/projected/320d920b-8b88-4531-9801-47ba810f9086-kube-api-access-nkthm\") pod \"320d920b-8b88-4531-9801-47ba810f9086\" (UID: \"320d920b-8b88-4531-9801-47ba810f9086\") " Apr 24 21:37:33.760333 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:37:33.760249 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/320d920b-8b88-4531-9801-47ba810f9086-kserve-provision-location\") pod \"320d920b-8b88-4531-9801-47ba810f9086\" (UID: \"320d920b-8b88-4531-9801-47ba810f9086\") " Apr 24 21:37:33.760333 ip-10-0-128-142 
kubenswrapper[2560]: I0424 21:37:33.760277 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/320d920b-8b88-4531-9801-47ba810f9086-tmp-dir\") pod \"320d920b-8b88-4531-9801-47ba810f9086\" (UID: \"320d920b-8b88-4531-9801-47ba810f9086\") " Apr 24 21:37:33.760333 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:37:33.760325 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/320d920b-8b88-4531-9801-47ba810f9086-home\") pod \"320d920b-8b88-4531-9801-47ba810f9086\" (UID: \"320d920b-8b88-4531-9801-47ba810f9086\") " Apr 24 21:37:33.760653 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:37:33.760464 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/320d920b-8b88-4531-9801-47ba810f9086-model-cache" (OuterVolumeSpecName: "model-cache") pod "320d920b-8b88-4531-9801-47ba810f9086" (UID: "320d920b-8b88-4531-9801-47ba810f9086"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:37:33.760653 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:37:33.760603 2560 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/320d920b-8b88-4531-9801-47ba810f9086-model-cache\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\"" Apr 24 21:37:33.761127 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:37:33.761092 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/320d920b-8b88-4531-9801-47ba810f9086-home" (OuterVolumeSpecName: "home") pod "320d920b-8b88-4531-9801-47ba810f9086" (UID: "320d920b-8b88-4531-9801-47ba810f9086"). InnerVolumeSpecName "home". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:37:33.762513 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:37:33.762486 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/320d920b-8b88-4531-9801-47ba810f9086-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "320d920b-8b88-4531-9801-47ba810f9086" (UID: "320d920b-8b88-4531-9801-47ba810f9086"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:37:33.762630 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:37:33.762509 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/320d920b-8b88-4531-9801-47ba810f9086-kube-api-access-nkthm" (OuterVolumeSpecName: "kube-api-access-nkthm") pod "320d920b-8b88-4531-9801-47ba810f9086" (UID: "320d920b-8b88-4531-9801-47ba810f9086"). InnerVolumeSpecName "kube-api-access-nkthm". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:37:33.762630 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:37:33.762614 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/320d920b-8b88-4531-9801-47ba810f9086-dshm" (OuterVolumeSpecName: "dshm") pod "320d920b-8b88-4531-9801-47ba810f9086" (UID: "320d920b-8b88-4531-9801-47ba810f9086"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:37:33.779168 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:37:33.779127 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/320d920b-8b88-4531-9801-47ba810f9086-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "320d920b-8b88-4531-9801-47ba810f9086" (UID: "320d920b-8b88-4531-9801-47ba810f9086"). InnerVolumeSpecName "tmp-dir". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:37:33.782560 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:37:33.782536 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-test-kserve-86f4fff555-k5gn5_320d920b-8b88-4531-9801-47ba810f9086/main/0.log"
Apr 24 21:37:33.782899 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:37:33.782863 2560 generic.go:358] "Generic (PLEG): container finished" podID="320d920b-8b88-4531-9801-47ba810f9086" containerID="35b9851c13e54017e7414ae1bee8cca1dcc1470ff77afde1d81ec9fdede904e9" exitCode=137
Apr 24 21:37:33.783016 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:37:33.782954 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-86f4fff555-k5gn5" event={"ID":"320d920b-8b88-4531-9801-47ba810f9086","Type":"ContainerDied","Data":"35b9851c13e54017e7414ae1bee8cca1dcc1470ff77afde1d81ec9fdede904e9"}
Apr 24 21:37:33.783016 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:37:33.782975 2560 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-86f4fff555-k5gn5"
Apr 24 21:37:33.783016 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:37:33.782998 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-86f4fff555-k5gn5" event={"ID":"320d920b-8b88-4531-9801-47ba810f9086","Type":"ContainerDied","Data":"82b7cbca965473f798fb5adf8f27656229a14e866d3b60af44065f05ef67fc5f"}
Apr 24 21:37:33.783133 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:37:33.783018 2560 scope.go:117] "RemoveContainer" containerID="35b9851c13e54017e7414ae1bee8cca1dcc1470ff77afde1d81ec9fdede904e9"
Apr 24 21:37:33.791443 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:37:33.791424 2560 scope.go:117] "RemoveContainer" containerID="a036f0e2d9b069997b73d973617d80fa4ebed4fe998c1f5f1fab610788071f1a"
Apr 24 21:37:33.829414 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:37:33.829355 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/320d920b-8b88-4531-9801-47ba810f9086-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "320d920b-8b88-4531-9801-47ba810f9086" (UID: "320d920b-8b88-4531-9801-47ba810f9086"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:37:33.854562 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:37:33.854542 2560 scope.go:117] "RemoveContainer" containerID="35b9851c13e54017e7414ae1bee8cca1dcc1470ff77afde1d81ec9fdede904e9"
Apr 24 21:37:33.854886 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:37:33.854859 2560 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35b9851c13e54017e7414ae1bee8cca1dcc1470ff77afde1d81ec9fdede904e9\": container with ID starting with 35b9851c13e54017e7414ae1bee8cca1dcc1470ff77afde1d81ec9fdede904e9 not found: ID does not exist" containerID="35b9851c13e54017e7414ae1bee8cca1dcc1470ff77afde1d81ec9fdede904e9"
Apr 24 21:37:33.854966 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:37:33.854899 2560 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35b9851c13e54017e7414ae1bee8cca1dcc1470ff77afde1d81ec9fdede904e9"} err="failed to get container status \"35b9851c13e54017e7414ae1bee8cca1dcc1470ff77afde1d81ec9fdede904e9\": rpc error: code = NotFound desc = could not find container \"35b9851c13e54017e7414ae1bee8cca1dcc1470ff77afde1d81ec9fdede904e9\": container with ID starting with 35b9851c13e54017e7414ae1bee8cca1dcc1470ff77afde1d81ec9fdede904e9 not found: ID does not exist"
Apr 24 21:37:33.854966 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:37:33.854935 2560 scope.go:117] "RemoveContainer" containerID="a036f0e2d9b069997b73d973617d80fa4ebed4fe998c1f5f1fab610788071f1a"
Apr 24 21:37:33.855231 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:37:33.855212 2560 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a036f0e2d9b069997b73d973617d80fa4ebed4fe998c1f5f1fab610788071f1a\": container with ID starting with a036f0e2d9b069997b73d973617d80fa4ebed4fe998c1f5f1fab610788071f1a not found: ID does not exist" containerID="a036f0e2d9b069997b73d973617d80fa4ebed4fe998c1f5f1fab610788071f1a"
Apr 24 21:37:33.855283 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:37:33.855235 2560 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a036f0e2d9b069997b73d973617d80fa4ebed4fe998c1f5f1fab610788071f1a"} err="failed to get container status \"a036f0e2d9b069997b73d973617d80fa4ebed4fe998c1f5f1fab610788071f1a\": rpc error: code = NotFound desc = could not find container \"a036f0e2d9b069997b73d973617d80fa4ebed4fe998c1f5f1fab610788071f1a\": container with ID starting with a036f0e2d9b069997b73d973617d80fa4ebed4fe998c1f5f1fab610788071f1a not found: ID does not exist"
Apr 24 21:37:33.861757 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:37:33.861740 2560 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nkthm\" (UniqueName: \"kubernetes.io/projected/320d920b-8b88-4531-9801-47ba810f9086-kube-api-access-nkthm\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\""
Apr 24 21:37:33.861816 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:37:33.861763 2560 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/320d920b-8b88-4531-9801-47ba810f9086-kserve-provision-location\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\""
Apr 24 21:37:33.861816 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:37:33.861778 2560 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/320d920b-8b88-4531-9801-47ba810f9086-tmp-dir\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\""
Apr 24 21:37:33.861816 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:37:33.861789 2560 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/320d920b-8b88-4531-9801-47ba810f9086-home\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\""
Apr 24 21:37:33.861816 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:37:33.861797 2560 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/320d920b-8b88-4531-9801-47ba810f9086-dshm\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\""
Apr 24 21:37:33.861816 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:37:33.861804 2560 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/320d920b-8b88-4531-9801-47ba810f9086-tls-certs\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\""
Apr 24 21:37:34.107594 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:37:34.107564 2560 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-86f4fff555-k5gn5"]
Apr 24 21:37:34.112462 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:37:34.112438 2560 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-86f4fff555-k5gn5"]
Apr 24 21:37:34.445714 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:37:34.445621 2560 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="320d920b-8b88-4531-9801-47ba810f9086" path="/var/lib/kubelet/pods/320d920b-8b88-4531-9801-47ba810f9086/volumes"
Apr 24 21:37:35.442781 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:37:35.442735 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" podUID="34cfb37c-464d-48d7-bb12-fb57061b8e43" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")"
Apr 24 21:37:39.942471 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:37:39.942425 2560 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-66c6c4996-4s95w" podUID="63c53ce2-2559-4384-9433-c0292f470550" containerName="main" probeResult="failure" output="Get \"https://10.134.0.41:8001/health\": dial tcp 10.134.0.41:8001: connect: connection refused"
Apr 24 21:37:39.955079 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:37:39.955049 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-66c6c4996-4s95w"
Apr 24 21:37:45.443386 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:37:45.443306 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" podUID="34cfb37c-464d-48d7-bb12-fb57061b8e43" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")"
Apr 24 21:37:49.942246 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:37:49.942187 2560 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-66c6c4996-4s95w" podUID="63c53ce2-2559-4384-9433-c0292f470550" containerName="main" probeResult="failure" output="Get \"https://10.134.0.41:8001/health\": dial tcp 10.134.0.41:8001: connect: connection refused"
Apr 24 21:37:55.442524 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:37:55.442480 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" podUID="34cfb37c-464d-48d7-bb12-fb57061b8e43" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")"
Apr 24 21:37:59.942382 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:37:59.942329 2560 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-66c6c4996-4s95w" podUID="63c53ce2-2559-4384-9433-c0292f470550" containerName="main" probeResult="failure" output="Get \"https://10.134.0.41:8001/health\": dial tcp 10.134.0.41:8001: connect: connection refused"
Apr 24 21:38:05.442412 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:38:05.442369 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" podUID="34cfb37c-464d-48d7-bb12-fb57061b8e43" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")"
Apr 24 21:38:06.442671 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:38:06.442631 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" podUID="34cfb37c-464d-48d7-bb12-fb57061b8e43" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")"
Apr 24 21:38:09.942048 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:38:09.942001 2560 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-66c6c4996-4s95w" podUID="63c53ce2-2559-4384-9433-c0292f470550" containerName="main" probeResult="failure" output="Get \"https://10.134.0.41:8001/health\": dial tcp 10.134.0.41:8001: connect: connection refused"
Apr 24 21:38:15.955238 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:38:15.955184 2560 generic.go:358] "Generic (PLEG): container finished" podID="6cda0b52-5413-49e9-a85e-3f7ab69677c0" containerID="fa560f45cba553047824610a90e8111e17a9a0c4a236a5efa479d7ddac603802" exitCode=0
Apr 24 21:38:15.955699 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:38:15.955241 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8c5hmz7" event={"ID":"6cda0b52-5413-49e9-a85e-3f7ab69677c0","Type":"ContainerDied","Data":"fa560f45cba553047824610a90e8111e17a9a0c4a236a5efa479d7ddac603802"}
Apr 24 21:38:16.442785 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:38:16.442742 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" podUID="34cfb37c-464d-48d7-bb12-fb57061b8e43" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")"
Apr 24 21:38:16.961095 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:38:16.961051 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8c5hmz7" event={"ID":"6cda0b52-5413-49e9-a85e-3f7ab69677c0","Type":"ContainerStarted","Data":"809eab37b2efec04d11bab99e58f8347bfcf92e0ddcbf6bcd1dd01e9478719e2"}
Apr 24 21:38:16.989072 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:38:16.988993 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8c5hmz7" podStartSLOduration=57.988969097 podStartE2EDuration="57.988969097s" podCreationTimestamp="2026-04-24 21:37:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:38:16.982959747 +0000 UTC m=+1285.041747421" watchObservedRunningTime="2026-04-24 21:38:16.988969097 +0000 UTC m=+1285.047756771"
Apr 24 21:38:19.943002 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:38:19.942825 2560 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-66c6c4996-4s95w" podUID="63c53ce2-2559-4384-9433-c0292f470550" containerName="main" probeResult="failure" output="Get \"https://10.134.0.41:8001/health\": dial tcp 10.134.0.41:8001: connect: connection refused"
Apr 24 21:38:19.967316 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:38:19.967281 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8c5hmz7"
Apr 24 21:38:19.967516 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:38:19.967332 2560 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8c5hmz7"
Apr 24 21:38:19.968706 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:38:19.968667 2560 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8c5hmz7" podUID="6cda0b52-5413-49e9-a85e-3f7ab69677c0" containerName="main" probeResult="failure" output="Get \"https://10.134.0.42:8000/health\": dial tcp 10.134.0.42:8000: connect: connection refused"
Apr 24 21:38:26.443426 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:38:26.443379 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" podUID="34cfb37c-464d-48d7-bb12-fb57061b8e43" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")"
Apr 24 21:38:29.942421 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:38:29.942368 2560 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-66c6c4996-4s95w" podUID="63c53ce2-2559-4384-9433-c0292f470550" containerName="main" probeResult="failure" output="Get \"https://10.134.0.41:8001/health\": dial tcp 10.134.0.41:8001: connect: connection refused"
Apr 24 21:38:29.967636 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:38:29.967590 2560 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8c5hmz7" podUID="6cda0b52-5413-49e9-a85e-3f7ab69677c0" containerName="main" probeResult="failure" output="Get \"https://10.134.0.42:8000/health\": dial tcp 10.134.0.42:8000: connect: connection refused"
Apr 24 21:38:36.442628 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:38:36.442590 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" podUID="34cfb37c-464d-48d7-bb12-fb57061b8e43" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")"
Apr 24 21:38:39.942535 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:38:39.942485 2560 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-66c6c4996-4s95w" podUID="63c53ce2-2559-4384-9433-c0292f470550" containerName="main" probeResult="failure" output="Get \"https://10.134.0.41:8001/health\": dial tcp 10.134.0.41:8001: connect: connection refused"
Apr 24 21:38:39.967777 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:38:39.967718 2560 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8c5hmz7" podUID="6cda0b52-5413-49e9-a85e-3f7ab69677c0" containerName="main" probeResult="failure" output="Get \"https://10.134.0.42:8000/health\": dial tcp 10.134.0.42:8000: connect: connection refused"
Apr 24 21:38:46.449899 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:38:46.449311 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" podUID="34cfb37c-464d-48d7-bb12-fb57061b8e43" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")"
Apr 24 21:38:49.942229 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:38:49.942165 2560 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-66c6c4996-4s95w" podUID="63c53ce2-2559-4384-9433-c0292f470550" containerName="main" probeResult="failure" output="Get \"https://10.134.0.41:8001/health\": dial tcp 10.134.0.41:8001: connect: connection refused"
Apr 24 21:38:49.968574 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:38:49.968530 2560 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8c5hmz7" podUID="6cda0b52-5413-49e9-a85e-3f7ab69677c0" containerName="main" probeResult="failure" output="Get \"https://10.134.0.42:8000/health\": dial tcp 10.134.0.42:8000: connect: connection refused"
Apr 24 21:38:56.443006 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:38:56.442963 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" podUID="34cfb37c-464d-48d7-bb12-fb57061b8e43" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")"
Apr 24 21:38:59.942937 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:38:59.942845 2560 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-66c6c4996-4s95w" podUID="63c53ce2-2559-4384-9433-c0292f470550" containerName="main" probeResult="failure" output="Get \"https://10.134.0.41:8001/health\": dial tcp 10.134.0.41:8001: connect: connection refused"
Apr 24 21:38:59.968546 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:38:59.968497 2560 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8c5hmz7" podUID="6cda0b52-5413-49e9-a85e-3f7ab69677c0" containerName="main" probeResult="failure" output="Get \"https://10.134.0.42:8000/health\": dial tcp 10.134.0.42:8000: connect: connection refused"
Apr 24 21:39:06.442726 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:39:06.442685 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" podUID="34cfb37c-464d-48d7-bb12-fb57061b8e43" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")"
Apr 24 21:39:09.957907 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:39:09.957872 2560 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-66c6c4996-4s95w"
Apr 24 21:39:09.967787 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:39:09.967737 2560 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8c5hmz7" podUID="6cda0b52-5413-49e9-a85e-3f7ab69677c0" containerName="main" probeResult="failure" output="Get \"https://10.134.0.42:8000/health\": dial tcp 10.134.0.42:8000: connect: connection refused"
Apr 24 21:39:09.970012 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:39:09.969989 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-66c6c4996-4s95w"
Apr 24 21:39:16.443028 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:39:16.442912 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" podUID="34cfb37c-464d-48d7-bb12-fb57061b8e43" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")"
Apr 24 21:39:19.967719 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:39:19.967673 2560 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8c5hmz7" podUID="6cda0b52-5413-49e9-a85e-3f7ab69677c0" containerName="main" probeResult="failure" output="Get \"https://10.134.0.42:8000/health\": dial tcp 10.134.0.42:8000: connect: connection refused"
Apr 24 21:39:22.446814 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:39:22.446770 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" podUID="34cfb37c-464d-48d7-bb12-fb57061b8e43" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")"
Apr 24 21:39:29.967637 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:39:29.967583 2560 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8c5hmz7" podUID="6cda0b52-5413-49e9-a85e-3f7ab69677c0" containerName="main" probeResult="failure" output="Get \"https://10.134.0.42:8000/health\": dial tcp 10.134.0.42:8000: connect: connection refused"
Apr 24 21:39:32.446787 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:39:32.446748 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" podUID="34cfb37c-464d-48d7-bb12-fb57061b8e43" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")"
Apr 24 21:39:39.968442 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:39:39.968380 2560 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8c5hmz7" podUID="6cda0b52-5413-49e9-a85e-3f7ab69677c0" containerName="main" probeResult="failure" output="Get \"https://10.134.0.42:8000/health\": dial tcp 10.134.0.42:8000: connect: connection refused"
Apr 24 21:39:42.447078 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:39:42.447033 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" podUID="34cfb37c-464d-48d7-bb12-fb57061b8e43" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")"
Apr 24 21:39:49.967943 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:39:49.967889 2560 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8c5hmz7" podUID="6cda0b52-5413-49e9-a85e-3f7ab69677c0" containerName="main" probeResult="failure" output="Get \"https://10.134.0.42:8000/health\": dial tcp 10.134.0.42:8000: connect: connection refused"
Apr 24 21:39:52.446812 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:39:52.446775 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" podUID="34cfb37c-464d-48d7-bb12-fb57061b8e43" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")"
Apr 24 21:39:59.978451 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:39:59.978410 2560 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8c5hmz7"
Apr 24 21:39:59.986538 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:39:59.986509 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8c5hmz7"
Apr 24 21:40:02.446855 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:02.446814 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" podUID="34cfb37c-464d-48d7-bb12-fb57061b8e43" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")"
Apr 24 21:40:08.133976 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:08.133938 2560 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8c5hmz7"]
Apr 24 21:40:08.134546 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:08.134241 2560 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8c5hmz7" podUID="6cda0b52-5413-49e9-a85e-3f7ab69677c0" containerName="main" containerID="cri-o://809eab37b2efec04d11bab99e58f8347bfcf92e0ddcbf6bcd1dd01e9478719e2" gracePeriod=30
Apr 24 21:40:08.136275 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:08.136249 2560 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-66c6c4996-4s95w"]
Apr 24 21:40:08.136634 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:08.136602 2560 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-66c6c4996-4s95w" podUID="63c53ce2-2559-4384-9433-c0292f470550" containerName="main" containerID="cri-o://459f2e76d7fd1b9c2fda234701cda8604839b1111fe78d3b7b9a56cba226495b" gracePeriod=30
Apr 24 21:40:12.446478 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:12.446434 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" podUID="34cfb37c-464d-48d7-bb12-fb57061b8e43" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")"
Apr 24 21:40:22.446957 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:22.446886 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" podUID="34cfb37c-464d-48d7-bb12-fb57061b8e43" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")"
Apr 24 21:40:26.673121 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:26.673080 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-585b44dc45-r4v84"]
Apr 24 21:40:26.673672 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:26.673587 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="320d920b-8b88-4531-9801-47ba810f9086" containerName="main"
Apr 24 21:40:26.673672 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:26.673606 2560 state_mem.go:107] "Deleted CPUSet assignment" podUID="320d920b-8b88-4531-9801-47ba810f9086" containerName="main"
Apr 24 21:40:26.673672 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:26.673626 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="320d920b-8b88-4531-9801-47ba810f9086" containerName="storage-initializer"
Apr 24 21:40:26.673672 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:26.673634 2560 state_mem.go:107] "Deleted CPUSet assignment" podUID="320d920b-8b88-4531-9801-47ba810f9086" containerName="storage-initializer"
Apr 24 21:40:26.673888 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:26.673710 2560 memory_manager.go:356] "RemoveStaleState removing state" podUID="320d920b-8b88-4531-9801-47ba810f9086" containerName="main"
Apr 24 21:40:26.677177 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:26.677154 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-8f8b95844-msd7t"]
Apr 24 21:40:26.677367 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:26.677347 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-585b44dc45-r4v84"
Apr 24 21:40:26.680536 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:26.680517 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-kserve-dockercfg-q5gwr\""
Apr 24 21:40:26.680667 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:26.680618 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-kserve-self-signed-certs\""
Apr 24 21:40:26.681228 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:26.681211 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-8f8b95844-msd7t"
Apr 24 21:40:26.693001 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:26.692393 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-585b44dc45-r4v84"]
Apr 24 21:40:26.695226 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:26.695204 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-8f8b95844-msd7t"]
Apr 24 21:40:26.812906 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:26.812862 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/0b81b123-aae5-4625-97b5-b21d96f905a4-dshm\") pod \"custom-route-timeout-pd-test-kserve-prefill-8f8b95844-msd7t\" (UID: \"0b81b123-aae5-4625-97b5-b21d96f905a4\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-8f8b95844-msd7t"
Apr 24 21:40:26.813132 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:26.812917 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/0b81b123-aae5-4625-97b5-b21d96f905a4-model-cache\") pod \"custom-route-timeout-pd-test-kserve-prefill-8f8b95844-msd7t\" (UID: \"0b81b123-aae5-4625-97b5-b21d96f905a4\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-8f8b95844-msd7t"
Apr 24 21:40:26.813132 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:26.812982 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0b81b123-aae5-4625-97b5-b21d96f905a4-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-prefill-8f8b95844-msd7t\" (UID: \"0b81b123-aae5-4625-97b5-b21d96f905a4\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-8f8b95844-msd7t"
Apr 24 21:40:26.813132 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:26.813020 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0b81b123-aae5-4625-97b5-b21d96f905a4-tmp-dir\") pod \"custom-route-timeout-pd-test-kserve-prefill-8f8b95844-msd7t\" (UID: \"0b81b123-aae5-4625-97b5-b21d96f905a4\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-8f8b95844-msd7t"
Apr 24 21:40:26.813132 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:26.813088 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/549eef45-c8c3-4375-b49a-9383c8d8525d-dshm\") pod \"custom-route-timeout-pd-test-kserve-585b44dc45-r4v84\" (UID: \"549eef45-c8c3-4375-b49a-9383c8d8525d\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-585b44dc45-r4v84"
Apr 24 21:40:26.813132 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:26.813119 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqc2f\" (UniqueName: \"kubernetes.io/projected/0b81b123-aae5-4625-97b5-b21d96f905a4-kube-api-access-cqc2f\") pod \"custom-route-timeout-pd-test-kserve-prefill-8f8b95844-msd7t\" (UID: \"0b81b123-aae5-4625-97b5-b21d96f905a4\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-8f8b95844-msd7t"
Apr 24 21:40:26.813327 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:26.813139 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pw5qs\" (UniqueName: \"kubernetes.io/projected/549eef45-c8c3-4375-b49a-9383c8d8525d-kube-api-access-pw5qs\") pod \"custom-route-timeout-pd-test-kserve-585b44dc45-r4v84\" (UID: \"549eef45-c8c3-4375-b49a-9383c8d8525d\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-585b44dc45-r4v84"
Apr 24 21:40:26.813327 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:26.813195 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/0b81b123-aae5-4625-97b5-b21d96f905a4-home\") pod \"custom-route-timeout-pd-test-kserve-prefill-8f8b95844-msd7t\" (UID: \"0b81b123-aae5-4625-97b5-b21d96f905a4\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-8f8b95844-msd7t"
Apr 24 21:40:26.813327 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:26.813214 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/549eef45-c8c3-4375-b49a-9383c8d8525d-home\") pod \"custom-route-timeout-pd-test-kserve-585b44dc45-r4v84\" (UID: \"549eef45-c8c3-4375-b49a-9383c8d8525d\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-585b44dc45-r4v84"
Apr 24 21:40:26.813327 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:26.813229 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/549eef45-c8c3-4375-b49a-9383c8d8525d-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-585b44dc45-r4v84\" (UID: \"549eef45-c8c3-4375-b49a-9383c8d8525d\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-585b44dc45-r4v84"
Apr 24 21:40:26.813327 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:26.813271 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/549eef45-c8c3-4375-b49a-9383c8d8525d-model-cache\") pod \"custom-route-timeout-pd-test-kserve-585b44dc45-r4v84\" (UID: \"549eef45-c8c3-4375-b49a-9383c8d8525d\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-585b44dc45-r4v84"
Apr 24 21:40:26.813327 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:26.813317 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/549eef45-c8c3-4375-b49a-9383c8d8525d-tmp-dir\") pod \"custom-route-timeout-pd-test-kserve-585b44dc45-r4v84\" (UID: \"549eef45-c8c3-4375-b49a-9383c8d8525d\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-585b44dc45-r4v84"
Apr 24 21:40:26.813528 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:26.813336 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/549eef45-c8c3-4375-b49a-9383c8d8525d-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-585b44dc45-r4v84\" (UID: \"549eef45-c8c3-4375-b49a-9383c8d8525d\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-585b44dc45-r4v84"
Apr 24 21:40:26.813528 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:26.813361 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0b81b123-aae5-4625-97b5-b21d96f905a4-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-prefill-8f8b95844-msd7t\" (UID: \"0b81b123-aae5-4625-97b5-b21d96f905a4\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-8f8b95844-msd7t"
Apr 24 21:40:26.914140 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:26.914096 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/0b81b123-aae5-4625-97b5-b21d96f905a4-home\") pod \"custom-route-timeout-pd-test-kserve-prefill-8f8b95844-msd7t\" (UID: \"0b81b123-aae5-4625-97b5-b21d96f905a4\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-8f8b95844-msd7t"
Apr 24 21:40:26.914140 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:26.914143 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/549eef45-c8c3-4375-b49a-9383c8d8525d-home\") pod \"custom-route-timeout-pd-test-kserve-585b44dc45-r4v84\" (UID: \"549eef45-c8c3-4375-b49a-9383c8d8525d\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-585b44dc45-r4v84"
Apr 24 21:40:26.914413 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:26.914160 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/549eef45-c8c3-4375-b49a-9383c8d8525d-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-585b44dc45-r4v84\" (UID: \"549eef45-c8c3-4375-b49a-9383c8d8525d\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-585b44dc45-r4v84"
Apr 24 21:40:26.914413 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:26.914195 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/549eef45-c8c3-4375-b49a-9383c8d8525d-model-cache\") pod \"custom-route-timeout-pd-test-kserve-585b44dc45-r4v84\" (UID: \"549eef45-c8c3-4375-b49a-9383c8d8525d\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-585b44dc45-r4v84"
Apr 24 21:40:26.914413 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:26.914247 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/549eef45-c8c3-4375-b49a-9383c8d8525d-tmp-dir\") pod \"custom-route-timeout-pd-test-kserve-585b44dc45-r4v84\" (UID: \"549eef45-c8c3-4375-b49a-9383c8d8525d\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-585b44dc45-r4v84"
Apr 24 21:40:26.914413 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:26.914272 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\"
(UniqueName: \"kubernetes.io/secret/549eef45-c8c3-4375-b49a-9383c8d8525d-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-585b44dc45-r4v84\" (UID: \"549eef45-c8c3-4375-b49a-9383c8d8525d\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-585b44dc45-r4v84" Apr 24 21:40:26.914413 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:26.914314 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0b81b123-aae5-4625-97b5-b21d96f905a4-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-prefill-8f8b95844-msd7t\" (UID: \"0b81b123-aae5-4625-97b5-b21d96f905a4\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-8f8b95844-msd7t" Apr 24 21:40:26.914413 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:26.914358 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/0b81b123-aae5-4625-97b5-b21d96f905a4-dshm\") pod \"custom-route-timeout-pd-test-kserve-prefill-8f8b95844-msd7t\" (UID: \"0b81b123-aae5-4625-97b5-b21d96f905a4\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-8f8b95844-msd7t" Apr 24 21:40:26.914413 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:26.914400 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/0b81b123-aae5-4625-97b5-b21d96f905a4-model-cache\") pod \"custom-route-timeout-pd-test-kserve-prefill-8f8b95844-msd7t\" (UID: \"0b81b123-aae5-4625-97b5-b21d96f905a4\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-8f8b95844-msd7t" Apr 24 21:40:26.914761 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:26.914432 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0b81b123-aae5-4625-97b5-b21d96f905a4-kserve-provision-location\") pod 
\"custom-route-timeout-pd-test-kserve-prefill-8f8b95844-msd7t\" (UID: \"0b81b123-aae5-4625-97b5-b21d96f905a4\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-8f8b95844-msd7t" Apr 24 21:40:26.914761 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:26.914462 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0b81b123-aae5-4625-97b5-b21d96f905a4-tmp-dir\") pod \"custom-route-timeout-pd-test-kserve-prefill-8f8b95844-msd7t\" (UID: \"0b81b123-aae5-4625-97b5-b21d96f905a4\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-8f8b95844-msd7t" Apr 24 21:40:26.914761 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:26.914488 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/549eef45-c8c3-4375-b49a-9383c8d8525d-dshm\") pod \"custom-route-timeout-pd-test-kserve-585b44dc45-r4v84\" (UID: \"549eef45-c8c3-4375-b49a-9383c8d8525d\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-585b44dc45-r4v84" Apr 24 21:40:26.914761 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:26.914517 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cqc2f\" (UniqueName: \"kubernetes.io/projected/0b81b123-aae5-4625-97b5-b21d96f905a4-kube-api-access-cqc2f\") pod \"custom-route-timeout-pd-test-kserve-prefill-8f8b95844-msd7t\" (UID: \"0b81b123-aae5-4625-97b5-b21d96f905a4\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-8f8b95844-msd7t" Apr 24 21:40:26.914761 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:26.914545 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pw5qs\" (UniqueName: \"kubernetes.io/projected/549eef45-c8c3-4375-b49a-9383c8d8525d-kube-api-access-pw5qs\") pod \"custom-route-timeout-pd-test-kserve-585b44dc45-r4v84\" (UID: 
\"549eef45-c8c3-4375-b49a-9383c8d8525d\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-585b44dc45-r4v84" Apr 24 21:40:26.914761 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:26.914616 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/549eef45-c8c3-4375-b49a-9383c8d8525d-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-585b44dc45-r4v84\" (UID: \"549eef45-c8c3-4375-b49a-9383c8d8525d\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-585b44dc45-r4v84" Apr 24 21:40:26.914761 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:26.914622 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/0b81b123-aae5-4625-97b5-b21d96f905a4-home\") pod \"custom-route-timeout-pd-test-kserve-prefill-8f8b95844-msd7t\" (UID: \"0b81b123-aae5-4625-97b5-b21d96f905a4\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-8f8b95844-msd7t" Apr 24 21:40:26.914761 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:26.914654 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/549eef45-c8c3-4375-b49a-9383c8d8525d-model-cache\") pod \"custom-route-timeout-pd-test-kserve-585b44dc45-r4v84\" (UID: \"549eef45-c8c3-4375-b49a-9383c8d8525d\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-585b44dc45-r4v84" Apr 24 21:40:26.915237 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:26.914870 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0b81b123-aae5-4625-97b5-b21d96f905a4-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-prefill-8f8b95844-msd7t\" (UID: \"0b81b123-aae5-4625-97b5-b21d96f905a4\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-8f8b95844-msd7t" Apr 
24 21:40:26.915237 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:26.914938 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/0b81b123-aae5-4625-97b5-b21d96f905a4-model-cache\") pod \"custom-route-timeout-pd-test-kserve-prefill-8f8b95844-msd7t\" (UID: \"0b81b123-aae5-4625-97b5-b21d96f905a4\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-8f8b95844-msd7t" Apr 24 21:40:26.915237 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:26.915036 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/549eef45-c8c3-4375-b49a-9383c8d8525d-tmp-dir\") pod \"custom-route-timeout-pd-test-kserve-585b44dc45-r4v84\" (UID: \"549eef45-c8c3-4375-b49a-9383c8d8525d\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-585b44dc45-r4v84" Apr 24 21:40:26.915237 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:26.915123 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/549eef45-c8c3-4375-b49a-9383c8d8525d-home\") pod \"custom-route-timeout-pd-test-kserve-585b44dc45-r4v84\" (UID: \"549eef45-c8c3-4375-b49a-9383c8d8525d\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-585b44dc45-r4v84" Apr 24 21:40:26.915237 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:26.915218 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0b81b123-aae5-4625-97b5-b21d96f905a4-tmp-dir\") pod \"custom-route-timeout-pd-test-kserve-prefill-8f8b95844-msd7t\" (UID: \"0b81b123-aae5-4625-97b5-b21d96f905a4\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-8f8b95844-msd7t" Apr 24 21:40:26.916861 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:26.916830 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: 
\"kubernetes.io/empty-dir/549eef45-c8c3-4375-b49a-9383c8d8525d-dshm\") pod \"custom-route-timeout-pd-test-kserve-585b44dc45-r4v84\" (UID: \"549eef45-c8c3-4375-b49a-9383c8d8525d\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-585b44dc45-r4v84" Apr 24 21:40:26.917006 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:26.916898 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/0b81b123-aae5-4625-97b5-b21d96f905a4-dshm\") pod \"custom-route-timeout-pd-test-kserve-prefill-8f8b95844-msd7t\" (UID: \"0b81b123-aae5-4625-97b5-b21d96f905a4\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-8f8b95844-msd7t" Apr 24 21:40:26.917006 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:26.916972 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/549eef45-c8c3-4375-b49a-9383c8d8525d-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-585b44dc45-r4v84\" (UID: \"549eef45-c8c3-4375-b49a-9383c8d8525d\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-585b44dc45-r4v84" Apr 24 21:40:26.917129 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:26.916995 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0b81b123-aae5-4625-97b5-b21d96f905a4-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-prefill-8f8b95844-msd7t\" (UID: \"0b81b123-aae5-4625-97b5-b21d96f905a4\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-8f8b95844-msd7t" Apr 24 21:40:26.924705 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:26.924642 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqc2f\" (UniqueName: \"kubernetes.io/projected/0b81b123-aae5-4625-97b5-b21d96f905a4-kube-api-access-cqc2f\") pod \"custom-route-timeout-pd-test-kserve-prefill-8f8b95844-msd7t\" (UID: 
\"0b81b123-aae5-4625-97b5-b21d96f905a4\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-8f8b95844-msd7t" Apr 24 21:40:26.924839 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:26.924733 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pw5qs\" (UniqueName: \"kubernetes.io/projected/549eef45-c8c3-4375-b49a-9383c8d8525d-kube-api-access-pw5qs\") pod \"custom-route-timeout-pd-test-kserve-585b44dc45-r4v84\" (UID: \"549eef45-c8c3-4375-b49a-9383c8d8525d\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-585b44dc45-r4v84" Apr 24 21:40:26.988380 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:26.988340 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-585b44dc45-r4v84" Apr 24 21:40:26.997181 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:26.997149 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-8f8b95844-msd7t" Apr 24 21:40:27.150186 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:27.150146 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-585b44dc45-r4v84"] Apr 24 21:40:27.154173 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:40:27.154140 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod549eef45_c8c3_4375_b49a_9383c8d8525d.slice/crio-99fb4a100354adbb381ebc061bf9e74a06c0a26580c4fad85bf40972da8c27df WatchSource:0}: Error finding container 99fb4a100354adbb381ebc061bf9e74a06c0a26580c4fad85bf40972da8c27df: Status 404 returned error can't find the container with id 99fb4a100354adbb381ebc061bf9e74a06c0a26580c4fad85bf40972da8c27df Apr 24 21:40:27.156193 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:27.156176 2560 provider.go:93] Refreshing cache for provider: 
*credentialprovider.defaultDockerConfigProvider Apr 24 21:40:27.169089 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:40:27.169064 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b81b123_aae5_4625_97b5_b21d96f905a4.slice/crio-de82765a5715005bcc058277a61f6b1063310222a19b29b849e3f03f4e0ba70b WatchSource:0}: Error finding container de82765a5715005bcc058277a61f6b1063310222a19b29b849e3f03f4e0ba70b: Status 404 returned error can't find the container with id de82765a5715005bcc058277a61f6b1063310222a19b29b849e3f03f4e0ba70b Apr 24 21:40:27.169320 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:27.169298 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-8f8b95844-msd7t"] Apr 24 21:40:27.487671 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:27.487566 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-585b44dc45-r4v84" event={"ID":"549eef45-c8c3-4375-b49a-9383c8d8525d","Type":"ContainerStarted","Data":"6e023ab788e84358c07f3c9d962be9c4b3a6cc4902e1db166ecab637826e7f4d"} Apr 24 21:40:27.487671 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:27.487610 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-585b44dc45-r4v84" event={"ID":"549eef45-c8c3-4375-b49a-9383c8d8525d","Type":"ContainerStarted","Data":"99fb4a100354adbb381ebc061bf9e74a06c0a26580c4fad85bf40972da8c27df"} Apr 24 21:40:27.487901 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:27.487728 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-585b44dc45-r4v84" Apr 24 21:40:27.489071 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:27.489037 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-8f8b95844-msd7t" event={"ID":"0b81b123-aae5-4625-97b5-b21d96f905a4","Type":"ContainerStarted","Data":"88644662690b2d67ca289727b90a4e56ebd285a52f94abfb4dd7fc2403a242c1"} Apr 24 21:40:27.489071 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:27.489069 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-8f8b95844-msd7t" event={"ID":"0b81b123-aae5-4625-97b5-b21d96f905a4","Type":"ContainerStarted","Data":"de82765a5715005bcc058277a61f6b1063310222a19b29b849e3f03f4e0ba70b"} Apr 24 21:40:28.496709 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:28.496666 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-585b44dc45-r4v84" event={"ID":"549eef45-c8c3-4375-b49a-9383c8d8525d","Type":"ContainerStarted","Data":"15d56be8d71909dae49f955d511a6be0f09fcf785d47f24f702926724ad3df64"} Apr 24 21:40:32.447026 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:32.446912 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" podUID="34cfb37c-464d-48d7-bb12-fb57061b8e43" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 24 21:40:32.514802 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:32.514764 2560 generic.go:358] "Generic (PLEG): container finished" podID="549eef45-c8c3-4375-b49a-9383c8d8525d" containerID="15d56be8d71909dae49f955d511a6be0f09fcf785d47f24f702926724ad3df64" exitCode=0 Apr 24 21:40:32.515040 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:32.514840 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-585b44dc45-r4v84" event={"ID":"549eef45-c8c3-4375-b49a-9383c8d8525d","Type":"ContainerDied","Data":"15d56be8d71909dae49f955d511a6be0f09fcf785d47f24f702926724ad3df64"} Apr 
24 21:40:33.521018 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:33.520965 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-585b44dc45-r4v84" event={"ID":"549eef45-c8c3-4375-b49a-9383c8d8525d","Type":"ContainerStarted","Data":"9784719d02259ffae5383279fd8321432e72502e59edfb48b608d3fa809deebc"} Apr 24 21:40:33.548132 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:33.548055 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-585b44dc45-r4v84" podStartSLOduration=7.548035922 podStartE2EDuration="7.548035922s" podCreationTimestamp="2026-04-24 21:40:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:40:33.546399464 +0000 UTC m=+1421.605187136" watchObservedRunningTime="2026-04-24 21:40:33.548035922 +0000 UTC m=+1421.606823591" Apr 24 21:40:36.989114 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:36.989072 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-585b44dc45-r4v84" Apr 24 21:40:36.989525 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:36.989245 2560 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-585b44dc45-r4v84" Apr 24 21:40:36.990649 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:36.990618 2560 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-585b44dc45-r4v84" podUID="549eef45-c8c3-4375-b49a-9383c8d8525d" containerName="main" probeResult="failure" output="Get \"https://10.134.0.43:8001/health\": dial tcp 10.134.0.43:8001: connect: connection refused" Apr 24 21:40:37.549917 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:37.549881 2560 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-585b44dc45-r4v84" Apr 24 21:40:38.137416 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:38.137364 2560 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-66c6c4996-4s95w" podUID="63c53ce2-2559-4384-9433-c0292f470550" containerName="llm-d-routing-sidecar" containerID="cri-o://d93110541c6a9061c1375158dff10a0ec3ec852824c4b023cc7758b7d0f39ec2" gracePeriod=2 Apr 24 21:40:38.442971 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:38.442942 2560 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8c5hmz7" Apr 24 21:40:38.446204 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:38.446180 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-66c6c4996-4s95w_63c53ce2-2559-4384-9433-c0292f470550/main/0.log" Apr 24 21:40:38.446887 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:38.446869 2560 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-66c6c4996-4s95w" Apr 24 21:40:38.525611 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:38.525570 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6cda0b52-5413-49e9-a85e-3f7ab69677c0-home\") pod \"6cda0b52-5413-49e9-a85e-3f7ab69677c0\" (UID: \"6cda0b52-5413-49e9-a85e-3f7ab69677c0\") " Apr 24 21:40:38.525795 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:38.525630 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/63c53ce2-2559-4384-9433-c0292f470550-home\") pod \"63c53ce2-2559-4384-9433-c0292f470550\" (UID: \"63c53ce2-2559-4384-9433-c0292f470550\") " Apr 24 21:40:38.525795 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:38.525653 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/63c53ce2-2559-4384-9433-c0292f470550-dshm\") pod \"63c53ce2-2559-4384-9433-c0292f470550\" (UID: \"63c53ce2-2559-4384-9433-c0292f470550\") " Apr 24 21:40:38.525795 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:38.525698 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6cda0b52-5413-49e9-a85e-3f7ab69677c0-tls-certs\") pod \"6cda0b52-5413-49e9-a85e-3f7ab69677c0\" (UID: \"6cda0b52-5413-49e9-a85e-3f7ab69677c0\") " Apr 24 21:40:38.525795 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:38.525726 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6cda0b52-5413-49e9-a85e-3f7ab69677c0-kserve-provision-location\") pod \"6cda0b52-5413-49e9-a85e-3f7ab69677c0\" (UID: \"6cda0b52-5413-49e9-a85e-3f7ab69677c0\") " Apr 24 21:40:38.526028 ip-10-0-128-142 kubenswrapper[2560]: I0424 
21:40:38.525901 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/63c53ce2-2559-4384-9433-c0292f470550-tls-certs\") pod \"63c53ce2-2559-4384-9433-c0292f470550\" (UID: \"63c53ce2-2559-4384-9433-c0292f470550\") " Apr 24 21:40:38.526185 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:38.526165 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6cda0b52-5413-49e9-a85e-3f7ab69677c0-tmp-dir\") pod \"6cda0b52-5413-49e9-a85e-3f7ab69677c0\" (UID: \"6cda0b52-5413-49e9-a85e-3f7ab69677c0\") " Apr 24 21:40:38.528088 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:38.528061 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-trl74\" (UniqueName: \"kubernetes.io/projected/6cda0b52-5413-49e9-a85e-3f7ab69677c0-kube-api-access-trl74\") pod \"6cda0b52-5413-49e9-a85e-3f7ab69677c0\" (UID: \"6cda0b52-5413-49e9-a85e-3f7ab69677c0\") " Apr 24 21:40:38.528088 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:38.526317 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63c53ce2-2559-4384-9433-c0292f470550-home" (OuterVolumeSpecName: "home") pod "63c53ce2-2559-4384-9433-c0292f470550" (UID: "63c53ce2-2559-4384-9433-c0292f470550"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:40:38.528305 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:38.526321 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6cda0b52-5413-49e9-a85e-3f7ab69677c0-home" (OuterVolumeSpecName: "home") pod "6cda0b52-5413-49e9-a85e-3f7ab69677c0" (UID: "6cda0b52-5413-49e9-a85e-3f7ab69677c0"). InnerVolumeSpecName "home". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:40:38.528305 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:38.527961 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cda0b52-5413-49e9-a85e-3f7ab69677c0-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "6cda0b52-5413-49e9-a85e-3f7ab69677c0" (UID: "6cda0b52-5413-49e9-a85e-3f7ab69677c0"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:40:38.528305 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:38.528110 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6cda0b52-5413-49e9-a85e-3f7ab69677c0-dshm\") pod \"6cda0b52-5413-49e9-a85e-3f7ab69677c0\" (UID: \"6cda0b52-5413-49e9-a85e-3f7ab69677c0\") " Apr 24 21:40:38.528305 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:38.528170 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fmvt2\" (UniqueName: \"kubernetes.io/projected/63c53ce2-2559-4384-9433-c0292f470550-kube-api-access-fmvt2\") pod \"63c53ce2-2559-4384-9433-c0292f470550\" (UID: \"63c53ce2-2559-4384-9433-c0292f470550\") " Apr 24 21:40:38.528305 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:38.528211 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/63c53ce2-2559-4384-9433-c0292f470550-tmp-dir\") pod \"63c53ce2-2559-4384-9433-c0292f470550\" (UID: \"63c53ce2-2559-4384-9433-c0292f470550\") " Apr 24 21:40:38.528586 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:38.528332 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63c53ce2-2559-4384-9433-c0292f470550-dshm" (OuterVolumeSpecName: "dshm") pod "63c53ce2-2559-4384-9433-c0292f470550" (UID: "63c53ce2-2559-4384-9433-c0292f470550"). InnerVolumeSpecName "dshm". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:40:38.528586 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:38.528506 2560 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6cda0b52-5413-49e9-a85e-3f7ab69677c0-home\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\"" Apr 24 21:40:38.528586 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:38.528522 2560 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/63c53ce2-2559-4384-9433-c0292f470550-home\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\"" Apr 24 21:40:38.528586 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:38.528534 2560 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/63c53ce2-2559-4384-9433-c0292f470550-dshm\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\"" Apr 24 21:40:38.528586 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:38.528547 2560 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6cda0b52-5413-49e9-a85e-3f7ab69677c0-tls-certs\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\"" Apr 24 21:40:38.529000 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:38.528946 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63c53ce2-2559-4384-9433-c0292f470550-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "63c53ce2-2559-4384-9433-c0292f470550" (UID: "63c53ce2-2559-4384-9433-c0292f470550"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:40:38.530408 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:38.530379 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6cda0b52-5413-49e9-a85e-3f7ab69677c0-dshm" (OuterVolumeSpecName: "dshm") pod "6cda0b52-5413-49e9-a85e-3f7ab69677c0" (UID: "6cda0b52-5413-49e9-a85e-3f7ab69677c0"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:40:38.530571 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:38.530547 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63c53ce2-2559-4384-9433-c0292f470550-kube-api-access-fmvt2" (OuterVolumeSpecName: "kube-api-access-fmvt2") pod "63c53ce2-2559-4384-9433-c0292f470550" (UID: "63c53ce2-2559-4384-9433-c0292f470550"). InnerVolumeSpecName "kube-api-access-fmvt2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:40:38.530756 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:38.530733 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6cda0b52-5413-49e9-a85e-3f7ab69677c0-kube-api-access-trl74" (OuterVolumeSpecName: "kube-api-access-trl74") pod "6cda0b52-5413-49e9-a85e-3f7ab69677c0" (UID: "6cda0b52-5413-49e9-a85e-3f7ab69677c0"). InnerVolumeSpecName "kube-api-access-trl74". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:40:38.539680 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:38.539638 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6cda0b52-5413-49e9-a85e-3f7ab69677c0-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "6cda0b52-5413-49e9-a85e-3f7ab69677c0" (UID: "6cda0b52-5413-49e9-a85e-3f7ab69677c0"). InnerVolumeSpecName "tmp-dir". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:40:38.543048 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:38.543023 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-66c6c4996-4s95w_63c53ce2-2559-4384-9433-c0292f470550/main/0.log" Apr 24 21:40:38.543736 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:38.543707 2560 generic.go:358] "Generic (PLEG): container finished" podID="63c53ce2-2559-4384-9433-c0292f470550" containerID="459f2e76d7fd1b9c2fda234701cda8604839b1111fe78d3b7b9a56cba226495b" exitCode=137 Apr 24 21:40:38.543736 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:38.543735 2560 generic.go:358] "Generic (PLEG): container finished" podID="63c53ce2-2559-4384-9433-c0292f470550" containerID="d93110541c6a9061c1375158dff10a0ec3ec852824c4b023cc7758b7d0f39ec2" exitCode=0 Apr 24 21:40:38.543876 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:38.543804 2560 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-66c6c4996-4s95w" Apr 24 21:40:38.543938 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:38.543799 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-66c6c4996-4s95w" event={"ID":"63c53ce2-2559-4384-9433-c0292f470550","Type":"ContainerDied","Data":"459f2e76d7fd1b9c2fda234701cda8604839b1111fe78d3b7b9a56cba226495b"} Apr 24 21:40:38.544006 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:38.543943 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-66c6c4996-4s95w" event={"ID":"63c53ce2-2559-4384-9433-c0292f470550","Type":"ContainerDied","Data":"d93110541c6a9061c1375158dff10a0ec3ec852824c4b023cc7758b7d0f39ec2"} Apr 24 21:40:38.544006 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:38.543965 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-66c6c4996-4s95w" event={"ID":"63c53ce2-2559-4384-9433-c0292f470550","Type":"ContainerDied","Data":"7402a666dea25d8cb1e0a0db541b46e6b5eac419dd55099cd6b42136ce278410"} Apr 24 21:40:38.544006 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:38.543985 2560 scope.go:117] "RemoveContainer" containerID="459f2e76d7fd1b9c2fda234701cda8604839b1111fe78d3b7b9a56cba226495b" Apr 24 21:40:38.545625 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:38.545598 2560 generic.go:358] "Generic (PLEG): container finished" podID="6cda0b52-5413-49e9-a85e-3f7ab69677c0" containerID="809eab37b2efec04d11bab99e58f8347bfcf92e0ddcbf6bcd1dd01e9478719e2" exitCode=137 Apr 24 21:40:38.545780 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:38.545661 2560 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8c5hmz7" Apr 24 21:40:38.545780 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:38.545678 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8c5hmz7" event={"ID":"6cda0b52-5413-49e9-a85e-3f7ab69677c0","Type":"ContainerDied","Data":"809eab37b2efec04d11bab99e58f8347bfcf92e0ddcbf6bcd1dd01e9478719e2"} Apr 24 21:40:38.545780 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:38.545706 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8c5hmz7" event={"ID":"6cda0b52-5413-49e9-a85e-3f7ab69677c0","Type":"ContainerDied","Data":"95cf6bf47f1d81de0ba91b07734f93d7a28ef68818452b4daf021c3201b1e74a"} Apr 24 21:40:38.551157 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:38.551122 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63c53ce2-2559-4384-9433-c0292f470550-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "63c53ce2-2559-4384-9433-c0292f470550" (UID: "63c53ce2-2559-4384-9433-c0292f470550"). InnerVolumeSpecName "tmp-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:40:38.554708 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:38.554690 2560 scope.go:117] "RemoveContainer" containerID="e6db3f8b936f1c1a779cc03862d4f356c50c064265ec1a783ed403e5e3cae7a6" Apr 24 21:40:38.598440 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:38.598380 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6cda0b52-5413-49e9-a85e-3f7ab69677c0-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "6cda0b52-5413-49e9-a85e-3f7ab69677c0" (UID: "6cda0b52-5413-49e9-a85e-3f7ab69677c0"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:40:38.621461 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:38.621436 2560 scope.go:117] "RemoveContainer" containerID="d93110541c6a9061c1375158dff10a0ec3ec852824c4b023cc7758b7d0f39ec2" Apr 24 21:40:38.628904 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:38.628876 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/63c53ce2-2559-4384-9433-c0292f470550-model-cache\") pod \"63c53ce2-2559-4384-9433-c0292f470550\" (UID: \"63c53ce2-2559-4384-9433-c0292f470550\") " Apr 24 21:40:38.629060 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:38.628912 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6cda0b52-5413-49e9-a85e-3f7ab69677c0-model-cache\") pod \"6cda0b52-5413-49e9-a85e-3f7ab69677c0\" (UID: \"6cda0b52-5413-49e9-a85e-3f7ab69677c0\") " Apr 24 21:40:38.629060 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:38.628964 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/63c53ce2-2559-4384-9433-c0292f470550-kserve-provision-location\") pod \"63c53ce2-2559-4384-9433-c0292f470550\" (UID: \"63c53ce2-2559-4384-9433-c0292f470550\") " Apr 24 21:40:38.629193 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:38.629164 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63c53ce2-2559-4384-9433-c0292f470550-model-cache" (OuterVolumeSpecName: "model-cache") pod "63c53ce2-2559-4384-9433-c0292f470550" (UID: "63c53ce2-2559-4384-9433-c0292f470550"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:40:38.629523 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:38.629271 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6cda0b52-5413-49e9-a85e-3f7ab69677c0-model-cache" (OuterVolumeSpecName: "model-cache") pod "6cda0b52-5413-49e9-a85e-3f7ab69677c0" (UID: "6cda0b52-5413-49e9-a85e-3f7ab69677c0"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:40:38.629523 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:38.629334 2560 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fmvt2\" (UniqueName: \"kubernetes.io/projected/63c53ce2-2559-4384-9433-c0292f470550-kube-api-access-fmvt2\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\"" Apr 24 21:40:38.629523 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:38.629357 2560 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/63c53ce2-2559-4384-9433-c0292f470550-tmp-dir\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\"" Apr 24 21:40:38.629523 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:38.629375 2560 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/63c53ce2-2559-4384-9433-c0292f470550-model-cache\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\"" Apr 24 21:40:38.629523 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:38.629389 2560 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6cda0b52-5413-49e9-a85e-3f7ab69677c0-kserve-provision-location\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\"" Apr 24 21:40:38.629523 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:38.629401 2560 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/63c53ce2-2559-4384-9433-c0292f470550-tls-certs\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\"" Apr 24 21:40:38.629523 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:38.629418 2560 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6cda0b52-5413-49e9-a85e-3f7ab69677c0-tmp-dir\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\"" Apr 24 21:40:38.629523 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:38.629431 2560 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-trl74\" (UniqueName: \"kubernetes.io/projected/6cda0b52-5413-49e9-a85e-3f7ab69677c0-kube-api-access-trl74\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\"" Apr 24 21:40:38.629523 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:38.629443 2560 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6cda0b52-5413-49e9-a85e-3f7ab69677c0-dshm\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\"" Apr 24 21:40:38.631646 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:38.631621 2560 scope.go:117] "RemoveContainer" containerID="459f2e76d7fd1b9c2fda234701cda8604839b1111fe78d3b7b9a56cba226495b" Apr 24 21:40:38.632049 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:40:38.632028 2560 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"459f2e76d7fd1b9c2fda234701cda8604839b1111fe78d3b7b9a56cba226495b\": container with ID starting with 459f2e76d7fd1b9c2fda234701cda8604839b1111fe78d3b7b9a56cba226495b not found: ID does not exist" containerID="459f2e76d7fd1b9c2fda234701cda8604839b1111fe78d3b7b9a56cba226495b" Apr 24 21:40:38.632154 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:38.632058 2560 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"459f2e76d7fd1b9c2fda234701cda8604839b1111fe78d3b7b9a56cba226495b"} err="failed to get 
container status \"459f2e76d7fd1b9c2fda234701cda8604839b1111fe78d3b7b9a56cba226495b\": rpc error: code = NotFound desc = could not find container \"459f2e76d7fd1b9c2fda234701cda8604839b1111fe78d3b7b9a56cba226495b\": container with ID starting with 459f2e76d7fd1b9c2fda234701cda8604839b1111fe78d3b7b9a56cba226495b not found: ID does not exist" Apr 24 21:40:38.632154 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:38.632081 2560 scope.go:117] "RemoveContainer" containerID="e6db3f8b936f1c1a779cc03862d4f356c50c064265ec1a783ed403e5e3cae7a6" Apr 24 21:40:38.632393 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:40:38.632368 2560 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6db3f8b936f1c1a779cc03862d4f356c50c064265ec1a783ed403e5e3cae7a6\": container with ID starting with e6db3f8b936f1c1a779cc03862d4f356c50c064265ec1a783ed403e5e3cae7a6 not found: ID does not exist" containerID="e6db3f8b936f1c1a779cc03862d4f356c50c064265ec1a783ed403e5e3cae7a6" Apr 24 21:40:38.632459 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:38.632406 2560 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6db3f8b936f1c1a779cc03862d4f356c50c064265ec1a783ed403e5e3cae7a6"} err="failed to get container status \"e6db3f8b936f1c1a779cc03862d4f356c50c064265ec1a783ed403e5e3cae7a6\": rpc error: code = NotFound desc = could not find container \"e6db3f8b936f1c1a779cc03862d4f356c50c064265ec1a783ed403e5e3cae7a6\": container with ID starting with e6db3f8b936f1c1a779cc03862d4f356c50c064265ec1a783ed403e5e3cae7a6 not found: ID does not exist" Apr 24 21:40:38.632459 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:38.632432 2560 scope.go:117] "RemoveContainer" containerID="d93110541c6a9061c1375158dff10a0ec3ec852824c4b023cc7758b7d0f39ec2" Apr 24 21:40:38.632742 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:40:38.632717 2560 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code 
= NotFound desc = could not find container \"d93110541c6a9061c1375158dff10a0ec3ec852824c4b023cc7758b7d0f39ec2\": container with ID starting with d93110541c6a9061c1375158dff10a0ec3ec852824c4b023cc7758b7d0f39ec2 not found: ID does not exist" containerID="d93110541c6a9061c1375158dff10a0ec3ec852824c4b023cc7758b7d0f39ec2" Apr 24 21:40:38.632832 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:38.632750 2560 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d93110541c6a9061c1375158dff10a0ec3ec852824c4b023cc7758b7d0f39ec2"} err="failed to get container status \"d93110541c6a9061c1375158dff10a0ec3ec852824c4b023cc7758b7d0f39ec2\": rpc error: code = NotFound desc = could not find container \"d93110541c6a9061c1375158dff10a0ec3ec852824c4b023cc7758b7d0f39ec2\": container with ID starting with d93110541c6a9061c1375158dff10a0ec3ec852824c4b023cc7758b7d0f39ec2 not found: ID does not exist" Apr 24 21:40:38.632832 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:38.632774 2560 scope.go:117] "RemoveContainer" containerID="459f2e76d7fd1b9c2fda234701cda8604839b1111fe78d3b7b9a56cba226495b" Apr 24 21:40:38.633073 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:38.633046 2560 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"459f2e76d7fd1b9c2fda234701cda8604839b1111fe78d3b7b9a56cba226495b"} err="failed to get container status \"459f2e76d7fd1b9c2fda234701cda8604839b1111fe78d3b7b9a56cba226495b\": rpc error: code = NotFound desc = could not find container \"459f2e76d7fd1b9c2fda234701cda8604839b1111fe78d3b7b9a56cba226495b\": container with ID starting with 459f2e76d7fd1b9c2fda234701cda8604839b1111fe78d3b7b9a56cba226495b not found: ID does not exist" Apr 24 21:40:38.633159 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:38.633075 2560 scope.go:117] "RemoveContainer" containerID="e6db3f8b936f1c1a779cc03862d4f356c50c064265ec1a783ed403e5e3cae7a6" Apr 24 21:40:38.633330 ip-10-0-128-142 kubenswrapper[2560]: 
I0424 21:40:38.633310 2560 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6db3f8b936f1c1a779cc03862d4f356c50c064265ec1a783ed403e5e3cae7a6"} err="failed to get container status \"e6db3f8b936f1c1a779cc03862d4f356c50c064265ec1a783ed403e5e3cae7a6\": rpc error: code = NotFound desc = could not find container \"e6db3f8b936f1c1a779cc03862d4f356c50c064265ec1a783ed403e5e3cae7a6\": container with ID starting with e6db3f8b936f1c1a779cc03862d4f356c50c064265ec1a783ed403e5e3cae7a6 not found: ID does not exist" Apr 24 21:40:38.633390 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:38.633331 2560 scope.go:117] "RemoveContainer" containerID="d93110541c6a9061c1375158dff10a0ec3ec852824c4b023cc7758b7d0f39ec2" Apr 24 21:40:38.633574 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:38.633548 2560 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d93110541c6a9061c1375158dff10a0ec3ec852824c4b023cc7758b7d0f39ec2"} err="failed to get container status \"d93110541c6a9061c1375158dff10a0ec3ec852824c4b023cc7758b7d0f39ec2\": rpc error: code = NotFound desc = could not find container \"d93110541c6a9061c1375158dff10a0ec3ec852824c4b023cc7758b7d0f39ec2\": container with ID starting with d93110541c6a9061c1375158dff10a0ec3ec852824c4b023cc7758b7d0f39ec2 not found: ID does not exist" Apr 24 21:40:38.633658 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:38.633574 2560 scope.go:117] "RemoveContainer" containerID="809eab37b2efec04d11bab99e58f8347bfcf92e0ddcbf6bcd1dd01e9478719e2" Apr 24 21:40:38.641773 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:38.641752 2560 scope.go:117] "RemoveContainer" containerID="fa560f45cba553047824610a90e8111e17a9a0c4a236a5efa479d7ddac603802" Apr 24 21:40:38.688656 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:38.688551 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/63c53ce2-2559-4384-9433-c0292f470550-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "63c53ce2-2559-4384-9433-c0292f470550" (UID: "63c53ce2-2559-4384-9433-c0292f470550"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:40:38.710841 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:38.710814 2560 scope.go:117] "RemoveContainer" containerID="809eab37b2efec04d11bab99e58f8347bfcf92e0ddcbf6bcd1dd01e9478719e2" Apr 24 21:40:38.711227 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:40:38.711193 2560 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"809eab37b2efec04d11bab99e58f8347bfcf92e0ddcbf6bcd1dd01e9478719e2\": container with ID starting with 809eab37b2efec04d11bab99e58f8347bfcf92e0ddcbf6bcd1dd01e9478719e2 not found: ID does not exist" containerID="809eab37b2efec04d11bab99e58f8347bfcf92e0ddcbf6bcd1dd01e9478719e2" Apr 24 21:40:38.711340 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:38.711235 2560 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"809eab37b2efec04d11bab99e58f8347bfcf92e0ddcbf6bcd1dd01e9478719e2"} err="failed to get container status \"809eab37b2efec04d11bab99e58f8347bfcf92e0ddcbf6bcd1dd01e9478719e2\": rpc error: code = NotFound desc = could not find container \"809eab37b2efec04d11bab99e58f8347bfcf92e0ddcbf6bcd1dd01e9478719e2\": container with ID starting with 809eab37b2efec04d11bab99e58f8347bfcf92e0ddcbf6bcd1dd01e9478719e2 not found: ID does not exist" Apr 24 21:40:38.711340 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:38.711255 2560 scope.go:117] "RemoveContainer" containerID="fa560f45cba553047824610a90e8111e17a9a0c4a236a5efa479d7ddac603802" Apr 24 21:40:38.711555 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:40:38.711538 2560 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"fa560f45cba553047824610a90e8111e17a9a0c4a236a5efa479d7ddac603802\": container with ID starting with fa560f45cba553047824610a90e8111e17a9a0c4a236a5efa479d7ddac603802 not found: ID does not exist" containerID="fa560f45cba553047824610a90e8111e17a9a0c4a236a5efa479d7ddac603802" Apr 24 21:40:38.711596 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:38.711570 2560 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa560f45cba553047824610a90e8111e17a9a0c4a236a5efa479d7ddac603802"} err="failed to get container status \"fa560f45cba553047824610a90e8111e17a9a0c4a236a5efa479d7ddac603802\": rpc error: code = NotFound desc = could not find container \"fa560f45cba553047824610a90e8111e17a9a0c4a236a5efa479d7ddac603802\": container with ID starting with fa560f45cba553047824610a90e8111e17a9a0c4a236a5efa479d7ddac603802 not found: ID does not exist" Apr 24 21:40:38.730178 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:38.730143 2560 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6cda0b52-5413-49e9-a85e-3f7ab69677c0-model-cache\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\"" Apr 24 21:40:38.730178 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:38.730179 2560 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/63c53ce2-2559-4384-9433-c0292f470550-kserve-provision-location\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\"" Apr 24 21:40:38.870950 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:38.870900 2560 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-66c6c4996-4s95w"] Apr 24 21:40:38.874985 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:38.874951 2560 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-66c6c4996-4s95w"] Apr 24 21:40:38.885202 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:38.885171 2560 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8c5hmz7"] Apr 24 21:40:38.890834 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:38.890805 2560 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-8c5hmz7"] Apr 24 21:40:40.443432 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:40.443391 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" podUID="34cfb37c-464d-48d7-bb12-fb57061b8e43" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 24 21:40:40.447212 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:40.447179 2560 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63c53ce2-2559-4384-9433-c0292f470550" path="/var/lib/kubelet/pods/63c53ce2-2559-4384-9433-c0292f470550/volumes" Apr 24 21:40:40.447846 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:40.447827 2560 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6cda0b52-5413-49e9-a85e-3f7ab69677c0" path="/var/lib/kubelet/pods/6cda0b52-5413-49e9-a85e-3f7ab69677c0/volumes" Apr 24 21:40:46.989763 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:46.989648 2560 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-585b44dc45-r4v84" podUID="549eef45-c8c3-4375-b49a-9383c8d8525d" containerName="main" probeResult="failure" output="Get \"https://10.134.0.43:8001/health\": dial tcp 10.134.0.43:8001: connect: connection refused" Apr 24 21:40:50.443421 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:50.443372 2560 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" podUID="34cfb37c-464d-48d7-bb12-fb57061b8e43" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 24 21:40:56.988793 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:40:56.988737 2560 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-585b44dc45-r4v84" podUID="549eef45-c8c3-4375-b49a-9383c8d8525d" containerName="main" probeResult="failure" output="Get \"https://10.134.0.43:8001/health\": dial tcp 10.134.0.43:8001: connect: connection refused" Apr 24 21:41:00.443942 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:41:00.443887 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" podUID="34cfb37c-464d-48d7-bb12-fb57061b8e43" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 24 21:41:06.989605 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:41:06.989552 2560 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-585b44dc45-r4v84" podUID="549eef45-c8c3-4375-b49a-9383c8d8525d" containerName="main" probeResult="failure" output="Get \"https://10.134.0.43:8001/health\": dial tcp 10.134.0.43:8001: connect: connection refused" Apr 24 21:41:10.443565 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:41:10.443525 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" podUID="34cfb37c-464d-48d7-bb12-fb57061b8e43" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 24 21:41:16.989562 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:41:16.989511 2560 prober.go:120] "Probe failed" probeType="Startup" 
pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-585b44dc45-r4v84" podUID="549eef45-c8c3-4375-b49a-9383c8d8525d" containerName="main" probeResult="failure" output="Get \"https://10.134.0.43:8001/health\": dial tcp 10.134.0.43:8001: connect: connection refused" Apr 24 21:41:20.443390 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:41:20.443345 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" podUID="34cfb37c-464d-48d7-bb12-fb57061b8e43" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 24 21:41:26.989230 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:41:26.989186 2560 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-585b44dc45-r4v84" podUID="549eef45-c8c3-4375-b49a-9383c8d8525d" containerName="main" probeResult="failure" output="Get \"https://10.134.0.43:8001/health\": dial tcp 10.134.0.43:8001: connect: connection refused" Apr 24 21:41:30.443378 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:41:30.443337 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" podUID="34cfb37c-464d-48d7-bb12-fb57061b8e43" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 24 21:41:36.989131 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:41:36.989079 2560 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-585b44dc45-r4v84" podUID="549eef45-c8c3-4375-b49a-9383c8d8525d" containerName="main" probeResult="failure" output="Get \"https://10.134.0.43:8001/health\": dial tcp 10.134.0.43:8001: connect: connection refused" Apr 24 21:41:40.443355 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:41:40.443317 2560 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" podUID="34cfb37c-464d-48d7-bb12-fb57061b8e43" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 24 21:41:44.442879 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:41:44.442842 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" podUID="34cfb37c-464d-48d7-bb12-fb57061b8e43" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 24 21:41:46.988718 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:41:46.988669 2560 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-585b44dc45-r4v84" podUID="549eef45-c8c3-4375-b49a-9383c8d8525d" containerName="main" probeResult="failure" output="Get \"https://10.134.0.43:8001/health\": dial tcp 10.134.0.43:8001: connect: connection refused" Apr 24 21:41:52.454416 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:41:52.454390 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6rbsp_260be1ca-939b-4b9e-9f93-078d2506aef0/ovn-acl-logging/0.log" Apr 24 21:41:52.455605 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:41:52.455583 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-128-142.ec2.internal_784ad9f5e87f1e75291c25fc06106e5e/kube-rbac-proxy-crio/2.log" Apr 24 21:41:52.457521 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:41:52.457503 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6rbsp_260be1ca-939b-4b9e-9f93-078d2506aef0/ovn-acl-logging/0.log" Apr 24 21:41:52.458549 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:41:52.458533 2560 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-128-142.ec2.internal_784ad9f5e87f1e75291c25fc06106e5e/kube-rbac-proxy-crio/2.log" Apr 24 21:41:54.442773 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:41:54.442726 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" podUID="34cfb37c-464d-48d7-bb12-fb57061b8e43" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 24 21:41:56.989564 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:41:56.989517 2560 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-585b44dc45-r4v84" podUID="549eef45-c8c3-4375-b49a-9383c8d8525d" containerName="main" probeResult="failure" output="Get \"https://10.134.0.43:8001/health\": dial tcp 10.134.0.43:8001: connect: connection refused" Apr 24 21:42:04.443683 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:42:04.443642 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" podUID="34cfb37c-464d-48d7-bb12-fb57061b8e43" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 24 21:42:06.997830 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:42:06.997802 2560 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-585b44dc45-r4v84" Apr 24 21:42:07.009639 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:42:07.009614 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-585b44dc45-r4v84" Apr 24 21:42:14.443488 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:42:14.443448 2560 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" podUID="34cfb37c-464d-48d7-bb12-fb57061b8e43" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 24 21:42:24.443079 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:42:24.443038 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" podUID="34cfb37c-464d-48d7-bb12-fb57061b8e43" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 24 21:42:34.443585 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:42:34.443547 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" podUID="34cfb37c-464d-48d7-bb12-fb57061b8e43" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 24 21:42:44.443838 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:42:44.443781 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" podUID="34cfb37c-464d-48d7-bb12-fb57061b8e43" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 24 21:42:53.442438 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:42:53.442398 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" podUID="34cfb37c-464d-48d7-bb12-fb57061b8e43" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 24 21:43:03.442592 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:43:03.442549 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" 
podUID="34cfb37c-464d-48d7-bb12-fb57061b8e43" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 24 21:43:13.442614 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:43:13.442570 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" podUID="34cfb37c-464d-48d7-bb12-fb57061b8e43" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 24 21:43:23.442116 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:43:23.442080 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" podUID="34cfb37c-464d-48d7-bb12-fb57061b8e43" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 24 21:43:33.442311 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:43:33.442268 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" podUID="34cfb37c-464d-48d7-bb12-fb57061b8e43" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 24 21:43:43.442713 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:43:43.442624 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" podUID="34cfb37c-464d-48d7-bb12-fb57061b8e43" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 24 21:43:53.442805 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:43:53.442758 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" podUID="34cfb37c-464d-48d7-bb12-fb57061b8e43" containerName="main" probeResult="failure" output="service unhealthy 
(responded with \"NOT_SERVING\")" Apr 24 21:43:57.442814 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:43:57.442774 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" podUID="34cfb37c-464d-48d7-bb12-fb57061b8e43" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 24 21:44:07.442847 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:44:07.442806 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" podUID="34cfb37c-464d-48d7-bb12-fb57061b8e43" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 24 21:44:17.443440 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:44:17.443397 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" podUID="34cfb37c-464d-48d7-bb12-fb57061b8e43" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 24 21:44:27.442681 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:44:27.442639 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" podUID="34cfb37c-464d-48d7-bb12-fb57061b8e43" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 24 21:44:37.443051 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:44:37.443007 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" podUID="34cfb37c-464d-48d7-bb12-fb57061b8e43" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 24 21:44:47.443685 ip-10-0-128-142 kubenswrapper[2560]: I0424 
21:44:47.443644 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" podUID="34cfb37c-464d-48d7-bb12-fb57061b8e43" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 24 21:44:57.442790 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:44:57.442748 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" podUID="34cfb37c-464d-48d7-bb12-fb57061b8e43" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 24 21:45:05.442538 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:45:05.442503 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" podUID="34cfb37c-464d-48d7-bb12-fb57061b8e43" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 24 21:45:15.442582 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:45:15.442492 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" podUID="34cfb37c-464d-48d7-bb12-fb57061b8e43" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 24 21:45:16.814288 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:45:16.814252 2560 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs"] Apr 24 21:45:16.814668 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:45:16.814587 2560 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" podUID="34cfb37c-464d-48d7-bb12-fb57061b8e43" containerName="main" 
containerID="cri-o://7a130f511d2d7a8133274c95919466c86b66bd2ec3b77cbf74579d96f9cba310" gracePeriod=30 Apr 24 21:45:16.814737 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:45:16.814644 2560 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" podUID="34cfb37c-464d-48d7-bb12-fb57061b8e43" containerName="tokenizer" containerID="cri-o://26a06b3fa60205c1abe98e89bc6c3be6adce773e378ea34399a6e85bcf6fcd5d" gracePeriod=30 Apr 24 21:45:16.816002 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:45:16.815967 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" podUID="34cfb37c-464d-48d7-bb12-fb57061b8e43" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 24 21:45:17.597873 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:45:17.597833 2560 generic.go:358] "Generic (PLEG): container finished" podID="34cfb37c-464d-48d7-bb12-fb57061b8e43" containerID="7a130f511d2d7a8133274c95919466c86b66bd2ec3b77cbf74579d96f9cba310" exitCode=0 Apr 24 21:45:17.598086 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:45:17.597890 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" event={"ID":"34cfb37c-464d-48d7-bb12-fb57061b8e43","Type":"ContainerDied","Data":"7a130f511d2d7a8133274c95919466c86b66bd2ec3b77cbf74579d96f9cba310"} Apr 24 21:45:17.724235 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:45:17.724190 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" podUID="34cfb37c-464d-48d7-bb12-fb57061b8e43" containerName="tokenizer" probeResult="failure" output="Get \"http://10.134.0.38:8082/healthz\": dial tcp 10.134.0.38:8082: connect: connection refused" Apr 24 21:45:17.948224 
ip-10-0-128-142 kubenswrapper[2560]: I0424 21:45:17.948203 2560 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" Apr 24 21:45:18.016765 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:45:18.016737 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/34cfb37c-464d-48d7-bb12-fb57061b8e43-tls-certs\") pod \"34cfb37c-464d-48d7-bb12-fb57061b8e43\" (UID: \"34cfb37c-464d-48d7-bb12-fb57061b8e43\") " Apr 24 21:45:18.016982 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:45:18.016776 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/34cfb37c-464d-48d7-bb12-fb57061b8e43-tokenizer-tmp\") pod \"34cfb37c-464d-48d7-bb12-fb57061b8e43\" (UID: \"34cfb37c-464d-48d7-bb12-fb57061b8e43\") " Apr 24 21:45:18.016982 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:45:18.016797 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/34cfb37c-464d-48d7-bb12-fb57061b8e43-tokenizer-cache\") pod \"34cfb37c-464d-48d7-bb12-fb57061b8e43\" (UID: \"34cfb37c-464d-48d7-bb12-fb57061b8e43\") " Apr 24 21:45:18.017116 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:45:18.016991 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gzjct\" (UniqueName: \"kubernetes.io/projected/34cfb37c-464d-48d7-bb12-fb57061b8e43-kube-api-access-gzjct\") pod \"34cfb37c-464d-48d7-bb12-fb57061b8e43\" (UID: \"34cfb37c-464d-48d7-bb12-fb57061b8e43\") " Apr 24 21:45:18.017116 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:45:18.017053 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34cfb37c-464d-48d7-bb12-fb57061b8e43-tokenizer-cache" (OuterVolumeSpecName: 
"tokenizer-cache") pod "34cfb37c-464d-48d7-bb12-fb57061b8e43" (UID: "34cfb37c-464d-48d7-bb12-fb57061b8e43"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:45:18.017116 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:45:18.017086 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/34cfb37c-464d-48d7-bb12-fb57061b8e43-tokenizer-uds\") pod \"34cfb37c-464d-48d7-bb12-fb57061b8e43\" (UID: \"34cfb37c-464d-48d7-bb12-fb57061b8e43\") " Apr 24 21:45:18.017116 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:45:18.017103 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34cfb37c-464d-48d7-bb12-fb57061b8e43-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "34cfb37c-464d-48d7-bb12-fb57061b8e43" (UID: "34cfb37c-464d-48d7-bb12-fb57061b8e43"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:45:18.017320 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:45:18.017146 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/34cfb37c-464d-48d7-bb12-fb57061b8e43-kserve-provision-location\") pod \"34cfb37c-464d-48d7-bb12-fb57061b8e43\" (UID: \"34cfb37c-464d-48d7-bb12-fb57061b8e43\") " Apr 24 21:45:18.017380 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:45:18.017314 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34cfb37c-464d-48d7-bb12-fb57061b8e43-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "34cfb37c-464d-48d7-bb12-fb57061b8e43" (UID: "34cfb37c-464d-48d7-bb12-fb57061b8e43"). InnerVolumeSpecName "tokenizer-uds". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:45:18.017380 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:45:18.017352 2560 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/34cfb37c-464d-48d7-bb12-fb57061b8e43-tokenizer-tmp\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\"" Apr 24 21:45:18.017380 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:45:18.017370 2560 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/34cfb37c-464d-48d7-bb12-fb57061b8e43-tokenizer-cache\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\"" Apr 24 21:45:18.017751 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:45:18.017729 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34cfb37c-464d-48d7-bb12-fb57061b8e43-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "34cfb37c-464d-48d7-bb12-fb57061b8e43" (UID: "34cfb37c-464d-48d7-bb12-fb57061b8e43"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:45:18.019297 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:45:18.019275 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34cfb37c-464d-48d7-bb12-fb57061b8e43-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "34cfb37c-464d-48d7-bb12-fb57061b8e43" (UID: "34cfb37c-464d-48d7-bb12-fb57061b8e43"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:45:18.019374 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:45:18.019279 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34cfb37c-464d-48d7-bb12-fb57061b8e43-kube-api-access-gzjct" (OuterVolumeSpecName: "kube-api-access-gzjct") pod "34cfb37c-464d-48d7-bb12-fb57061b8e43" (UID: "34cfb37c-464d-48d7-bb12-fb57061b8e43"). 
InnerVolumeSpecName "kube-api-access-gzjct". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:45:18.118180 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:45:18.118153 2560 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/34cfb37c-464d-48d7-bb12-fb57061b8e43-kserve-provision-location\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\"" Apr 24 21:45:18.118180 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:45:18.118177 2560 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/34cfb37c-464d-48d7-bb12-fb57061b8e43-tls-certs\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\"" Apr 24 21:45:18.118325 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:45:18.118187 2560 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gzjct\" (UniqueName: \"kubernetes.io/projected/34cfb37c-464d-48d7-bb12-fb57061b8e43-kube-api-access-gzjct\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\"" Apr 24 21:45:18.118325 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:45:18.118195 2560 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/34cfb37c-464d-48d7-bb12-fb57061b8e43-tokenizer-uds\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\"" Apr 24 21:45:18.603019 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:45:18.602981 2560 generic.go:358] "Generic (PLEG): container finished" podID="34cfb37c-464d-48d7-bb12-fb57061b8e43" containerID="26a06b3fa60205c1abe98e89bc6c3be6adce773e378ea34399a6e85bcf6fcd5d" exitCode=0 Apr 24 21:45:18.603152 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:45:18.603037 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" 
event={"ID":"34cfb37c-464d-48d7-bb12-fb57061b8e43","Type":"ContainerDied","Data":"26a06b3fa60205c1abe98e89bc6c3be6adce773e378ea34399a6e85bcf6fcd5d"} Apr 24 21:45:18.603152 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:45:18.603047 2560 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" Apr 24 21:45:18.603152 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:45:18.603061 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs" event={"ID":"34cfb37c-464d-48d7-bb12-fb57061b8e43","Type":"ContainerDied","Data":"31e57d82a5bd7d1973f823f90cb7ed34616c017abccab15559accd9c69934f23"} Apr 24 21:45:18.603152 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:45:18.603076 2560 scope.go:117] "RemoveContainer" containerID="26a06b3fa60205c1abe98e89bc6c3be6adce773e378ea34399a6e85bcf6fcd5d" Apr 24 21:45:18.611350 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:45:18.611334 2560 scope.go:117] "RemoveContainer" containerID="7a130f511d2d7a8133274c95919466c86b66bd2ec3b77cbf74579d96f9cba310" Apr 24 21:45:18.618781 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:45:18.618763 2560 scope.go:117] "RemoveContainer" containerID="de3e1679cdc9f25e6515bb0b52c4064ac35c5e251e5e3885979ead453834b14d" Apr 24 21:45:18.620529 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:45:18.620504 2560 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs"] Apr 24 21:45:18.624309 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:45:18.624288 2560 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccb9kbcs"] Apr 24 21:45:18.626713 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:45:18.626692 2560 scope.go:117] "RemoveContainer" 
containerID="26a06b3fa60205c1abe98e89bc6c3be6adce773e378ea34399a6e85bcf6fcd5d" Apr 24 21:45:18.626983 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:45:18.626964 2560 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26a06b3fa60205c1abe98e89bc6c3be6adce773e378ea34399a6e85bcf6fcd5d\": container with ID starting with 26a06b3fa60205c1abe98e89bc6c3be6adce773e378ea34399a6e85bcf6fcd5d not found: ID does not exist" containerID="26a06b3fa60205c1abe98e89bc6c3be6adce773e378ea34399a6e85bcf6fcd5d" Apr 24 21:45:18.627061 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:45:18.626988 2560 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26a06b3fa60205c1abe98e89bc6c3be6adce773e378ea34399a6e85bcf6fcd5d"} err="failed to get container status \"26a06b3fa60205c1abe98e89bc6c3be6adce773e378ea34399a6e85bcf6fcd5d\": rpc error: code = NotFound desc = could not find container \"26a06b3fa60205c1abe98e89bc6c3be6adce773e378ea34399a6e85bcf6fcd5d\": container with ID starting with 26a06b3fa60205c1abe98e89bc6c3be6adce773e378ea34399a6e85bcf6fcd5d not found: ID does not exist" Apr 24 21:45:18.627061 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:45:18.627005 2560 scope.go:117] "RemoveContainer" containerID="7a130f511d2d7a8133274c95919466c86b66bd2ec3b77cbf74579d96f9cba310" Apr 24 21:45:18.627244 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:45:18.627227 2560 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a130f511d2d7a8133274c95919466c86b66bd2ec3b77cbf74579d96f9cba310\": container with ID starting with 7a130f511d2d7a8133274c95919466c86b66bd2ec3b77cbf74579d96f9cba310 not found: ID does not exist" containerID="7a130f511d2d7a8133274c95919466c86b66bd2ec3b77cbf74579d96f9cba310" Apr 24 21:45:18.627323 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:45:18.627249 2560 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"7a130f511d2d7a8133274c95919466c86b66bd2ec3b77cbf74579d96f9cba310"} err="failed to get container status \"7a130f511d2d7a8133274c95919466c86b66bd2ec3b77cbf74579d96f9cba310\": rpc error: code = NotFound desc = could not find container \"7a130f511d2d7a8133274c95919466c86b66bd2ec3b77cbf74579d96f9cba310\": container with ID starting with 7a130f511d2d7a8133274c95919466c86b66bd2ec3b77cbf74579d96f9cba310 not found: ID does not exist" Apr 24 21:45:18.627323 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:45:18.627270 2560 scope.go:117] "RemoveContainer" containerID="de3e1679cdc9f25e6515bb0b52c4064ac35c5e251e5e3885979ead453834b14d" Apr 24 21:45:18.627477 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:45:18.627459 2560 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de3e1679cdc9f25e6515bb0b52c4064ac35c5e251e5e3885979ead453834b14d\": container with ID starting with de3e1679cdc9f25e6515bb0b52c4064ac35c5e251e5e3885979ead453834b14d not found: ID does not exist" containerID="de3e1679cdc9f25e6515bb0b52c4064ac35c5e251e5e3885979ead453834b14d" Apr 24 21:45:18.627522 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:45:18.627482 2560 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de3e1679cdc9f25e6515bb0b52c4064ac35c5e251e5e3885979ead453834b14d"} err="failed to get container status \"de3e1679cdc9f25e6515bb0b52c4064ac35c5e251e5e3885979ead453834b14d\": rpc error: code = NotFound desc = could not find container \"de3e1679cdc9f25e6515bb0b52c4064ac35c5e251e5e3885979ead453834b14d\": container with ID starting with de3e1679cdc9f25e6515bb0b52c4064ac35c5e251e5e3885979ead453834b14d not found: ID does not exist" Apr 24 21:45:20.446401 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:45:20.446370 2560 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34cfb37c-464d-48d7-bb12-fb57061b8e43" 
path="/var/lib/kubelet/pods/34cfb37c-464d-48d7-bb12-fb57061b8e43/volumes" Apr 24 21:45:36.906342 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:45:36.906308 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-84b86f64f-sjmf8"] Apr 24 21:45:36.906789 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:45:36.906596 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="34cfb37c-464d-48d7-bb12-fb57061b8e43" containerName="main" Apr 24 21:45:36.906789 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:45:36.906605 2560 state_mem.go:107] "Deleted CPUSet assignment" podUID="34cfb37c-464d-48d7-bb12-fb57061b8e43" containerName="main" Apr 24 21:45:36.906789 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:45:36.906614 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="34cfb37c-464d-48d7-bb12-fb57061b8e43" containerName="tokenizer" Apr 24 21:45:36.906789 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:45:36.906619 2560 state_mem.go:107] "Deleted CPUSet assignment" podUID="34cfb37c-464d-48d7-bb12-fb57061b8e43" containerName="tokenizer" Apr 24 21:45:36.906789 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:45:36.906628 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="34cfb37c-464d-48d7-bb12-fb57061b8e43" containerName="storage-initializer" Apr 24 21:45:36.906789 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:45:36.906634 2560 state_mem.go:107] "Deleted CPUSet assignment" podUID="34cfb37c-464d-48d7-bb12-fb57061b8e43" containerName="storage-initializer" Apr 24 21:45:36.906789 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:45:36.906641 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="63c53ce2-2559-4384-9433-c0292f470550" containerName="llm-d-routing-sidecar" Apr 24 21:45:36.906789 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:45:36.906646 2560 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="63c53ce2-2559-4384-9433-c0292f470550" containerName="llm-d-routing-sidecar" Apr 24 21:45:36.906789 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:45:36.906658 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6cda0b52-5413-49e9-a85e-3f7ab69677c0" containerName="storage-initializer" Apr 24 21:45:36.906789 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:45:36.906663 2560 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cda0b52-5413-49e9-a85e-3f7ab69677c0" containerName="storage-initializer" Apr 24 21:45:36.906789 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:45:36.906670 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="63c53ce2-2559-4384-9433-c0292f470550" containerName="storage-initializer" Apr 24 21:45:36.906789 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:45:36.906676 2560 state_mem.go:107] "Deleted CPUSet assignment" podUID="63c53ce2-2559-4384-9433-c0292f470550" containerName="storage-initializer" Apr 24 21:45:36.906789 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:45:36.906682 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="63c53ce2-2559-4384-9433-c0292f470550" containerName="main" Apr 24 21:45:36.906789 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:45:36.906687 2560 state_mem.go:107] "Deleted CPUSet assignment" podUID="63c53ce2-2559-4384-9433-c0292f470550" containerName="main" Apr 24 21:45:36.906789 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:45:36.906697 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6cda0b52-5413-49e9-a85e-3f7ab69677c0" containerName="main" Apr 24 21:45:36.906789 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:45:36.906701 2560 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cda0b52-5413-49e9-a85e-3f7ab69677c0" containerName="main" Apr 24 21:45:36.906789 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:45:36.906749 2560 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="63c53ce2-2559-4384-9433-c0292f470550" containerName="llm-d-routing-sidecar" Apr 24 21:45:36.906789 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:45:36.906757 2560 memory_manager.go:356] "RemoveStaleState removing state" podUID="63c53ce2-2559-4384-9433-c0292f470550" containerName="main" Apr 24 21:45:36.906789 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:45:36.906763 2560 memory_manager.go:356] "RemoveStaleState removing state" podUID="6cda0b52-5413-49e9-a85e-3f7ab69677c0" containerName="main" Apr 24 21:45:36.906789 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:45:36.906769 2560 memory_manager.go:356] "RemoveStaleState removing state" podUID="34cfb37c-464d-48d7-bb12-fb57061b8e43" containerName="main" Apr 24 21:45:36.906789 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:45:36.906776 2560 memory_manager.go:356] "RemoveStaleState removing state" podUID="34cfb37c-464d-48d7-bb12-fb57061b8e43" containerName="tokenizer" Apr 24 21:45:36.909904 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:45:36.909887 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-84b86f64f-sjmf8" Apr 24 21:45:36.912246 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:45:36.912227 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"precise-prefix-cache-test-kserve-self-signed-certs\"" Apr 24 21:45:36.919366 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:45:36.919342 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-84b86f64f-sjmf8"] Apr 24 21:45:36.962059 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:45:36.962035 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/2c3bf277-0981-4611-a983-fa549c1308a0-home\") pod \"precise-prefix-cache-test-kserve-84b86f64f-sjmf8\" (UID: \"2c3bf277-0981-4611-a983-fa549c1308a0\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-84b86f64f-sjmf8" Apr 24 21:45:36.962150 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:45:36.962079 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2c3bf277-0981-4611-a983-fa549c1308a0-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-84b86f64f-sjmf8\" (UID: \"2c3bf277-0981-4611-a983-fa549c1308a0\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-84b86f64f-sjmf8" Apr 24 21:45:36.962150 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:45:36.962102 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/2c3bf277-0981-4611-a983-fa549c1308a0-model-cache\") pod \"precise-prefix-cache-test-kserve-84b86f64f-sjmf8\" (UID: \"2c3bf277-0981-4611-a983-fa549c1308a0\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-84b86f64f-sjmf8" Apr 24 21:45:36.962230 
ip-10-0-128-142 kubenswrapper[2560]: I0424 21:45:36.962165 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2c3bf277-0981-4611-a983-fa549c1308a0-tmp-dir\") pod \"precise-prefix-cache-test-kserve-84b86f64f-sjmf8\" (UID: \"2c3bf277-0981-4611-a983-fa549c1308a0\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-84b86f64f-sjmf8" Apr 24 21:45:36.962230 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:45:36.962196 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bq77x\" (UniqueName: \"kubernetes.io/projected/2c3bf277-0981-4611-a983-fa549c1308a0-kube-api-access-bq77x\") pod \"precise-prefix-cache-test-kserve-84b86f64f-sjmf8\" (UID: \"2c3bf277-0981-4611-a983-fa549c1308a0\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-84b86f64f-sjmf8" Apr 24 21:45:36.962295 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:45:36.962230 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2c3bf277-0981-4611-a983-fa549c1308a0-tls-certs\") pod \"precise-prefix-cache-test-kserve-84b86f64f-sjmf8\" (UID: \"2c3bf277-0981-4611-a983-fa549c1308a0\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-84b86f64f-sjmf8" Apr 24 21:45:36.962295 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:45:36.962255 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2c3bf277-0981-4611-a983-fa549c1308a0-dshm\") pod \"precise-prefix-cache-test-kserve-84b86f64f-sjmf8\" (UID: \"2c3bf277-0981-4611-a983-fa549c1308a0\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-84b86f64f-sjmf8" Apr 24 21:45:37.063302 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:45:37.063275 2560 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2c3bf277-0981-4611-a983-fa549c1308a0-tls-certs\") pod \"precise-prefix-cache-test-kserve-84b86f64f-sjmf8\" (UID: \"2c3bf277-0981-4611-a983-fa549c1308a0\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-84b86f64f-sjmf8" Apr 24 21:45:37.063479 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:45:37.063310 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2c3bf277-0981-4611-a983-fa549c1308a0-dshm\") pod \"precise-prefix-cache-test-kserve-84b86f64f-sjmf8\" (UID: \"2c3bf277-0981-4611-a983-fa549c1308a0\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-84b86f64f-sjmf8" Apr 24 21:45:37.063479 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:45:37.063335 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/2c3bf277-0981-4611-a983-fa549c1308a0-home\") pod \"precise-prefix-cache-test-kserve-84b86f64f-sjmf8\" (UID: \"2c3bf277-0981-4611-a983-fa549c1308a0\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-84b86f64f-sjmf8" Apr 24 21:45:37.063479 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:45:37.063365 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2c3bf277-0981-4611-a983-fa549c1308a0-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-84b86f64f-sjmf8\" (UID: \"2c3bf277-0981-4611-a983-fa549c1308a0\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-84b86f64f-sjmf8" Apr 24 21:45:37.063479 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:45:37.063386 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/2c3bf277-0981-4611-a983-fa549c1308a0-model-cache\") pod \"precise-prefix-cache-test-kserve-84b86f64f-sjmf8\" (UID: 
\"2c3bf277-0981-4611-a983-fa549c1308a0\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-84b86f64f-sjmf8" Apr 24 21:45:37.063479 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:45:37.063403 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2c3bf277-0981-4611-a983-fa549c1308a0-tmp-dir\") pod \"precise-prefix-cache-test-kserve-84b86f64f-sjmf8\" (UID: \"2c3bf277-0981-4611-a983-fa549c1308a0\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-84b86f64f-sjmf8" Apr 24 21:45:37.063479 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:45:37.063421 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bq77x\" (UniqueName: \"kubernetes.io/projected/2c3bf277-0981-4611-a983-fa549c1308a0-kube-api-access-bq77x\") pod \"precise-prefix-cache-test-kserve-84b86f64f-sjmf8\" (UID: \"2c3bf277-0981-4611-a983-fa549c1308a0\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-84b86f64f-sjmf8" Apr 24 21:45:37.063861 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:45:37.063805 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/2c3bf277-0981-4611-a983-fa549c1308a0-model-cache\") pod \"precise-prefix-cache-test-kserve-84b86f64f-sjmf8\" (UID: \"2c3bf277-0981-4611-a983-fa549c1308a0\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-84b86f64f-sjmf8" Apr 24 21:45:37.063861 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:45:37.063839 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2c3bf277-0981-4611-a983-fa549c1308a0-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-84b86f64f-sjmf8\" (UID: \"2c3bf277-0981-4611-a983-fa549c1308a0\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-84b86f64f-sjmf8" Apr 24 21:45:37.063986 ip-10-0-128-142 
kubenswrapper[2560]: I0424 21:45:37.063868 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/2c3bf277-0981-4611-a983-fa549c1308a0-home\") pod \"precise-prefix-cache-test-kserve-84b86f64f-sjmf8\" (UID: \"2c3bf277-0981-4611-a983-fa549c1308a0\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-84b86f64f-sjmf8" Apr 24 21:45:37.064025 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:45:37.063983 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2c3bf277-0981-4611-a983-fa549c1308a0-tmp-dir\") pod \"precise-prefix-cache-test-kserve-84b86f64f-sjmf8\" (UID: \"2c3bf277-0981-4611-a983-fa549c1308a0\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-84b86f64f-sjmf8" Apr 24 21:45:37.065511 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:45:37.065491 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2c3bf277-0981-4611-a983-fa549c1308a0-dshm\") pod \"precise-prefix-cache-test-kserve-84b86f64f-sjmf8\" (UID: \"2c3bf277-0981-4611-a983-fa549c1308a0\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-84b86f64f-sjmf8" Apr 24 21:45:37.065773 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:45:37.065756 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2c3bf277-0981-4611-a983-fa549c1308a0-tls-certs\") pod \"precise-prefix-cache-test-kserve-84b86f64f-sjmf8\" (UID: \"2c3bf277-0981-4611-a983-fa549c1308a0\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-84b86f64f-sjmf8" Apr 24 21:45:37.073152 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:45:37.073126 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bq77x\" (UniqueName: \"kubernetes.io/projected/2c3bf277-0981-4611-a983-fa549c1308a0-kube-api-access-bq77x\") pod 
\"precise-prefix-cache-test-kserve-84b86f64f-sjmf8\" (UID: \"2c3bf277-0981-4611-a983-fa549c1308a0\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-84b86f64f-sjmf8" Apr 24 21:45:37.221017 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:45:37.220948 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-84b86f64f-sjmf8" Apr 24 21:45:37.348069 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:45:37.348041 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-84b86f64f-sjmf8"] Apr 24 21:45:37.350138 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:45:37.350112 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c3bf277_0981_4611_a983_fa549c1308a0.slice/crio-0a96ea58c64bd8aba1242a581d485599988cf06303966f4fc09387d442460381 WatchSource:0}: Error finding container 0a96ea58c64bd8aba1242a581d485599988cf06303966f4fc09387d442460381: Status 404 returned error can't find the container with id 0a96ea58c64bd8aba1242a581d485599988cf06303966f4fc09387d442460381 Apr 24 21:45:37.351969 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:45:37.351949 2560 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 21:45:37.667296 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:45:37.667254 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-84b86f64f-sjmf8" event={"ID":"2c3bf277-0981-4611-a983-fa549c1308a0","Type":"ContainerStarted","Data":"951b4e3e2985d1302c9511c1181e242215cc50d8cba2933e930e75de247d7e83"} Apr 24 21:45:37.667296 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:45:37.667299 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-84b86f64f-sjmf8" 
event={"ID":"2c3bf277-0981-4611-a983-fa549c1308a0","Type":"ContainerStarted","Data":"0a96ea58c64bd8aba1242a581d485599988cf06303966f4fc09387d442460381"} Apr 24 21:45:54.922044 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:45:54.922017 2560 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c3bf277_0981_4611_a983_fa549c1308a0.slice/crio-951b4e3e2985d1302c9511c1181e242215cc50d8cba2933e930e75de247d7e83.scope\": RecentStats: unable to find data in memory cache]" Apr 24 21:45:55.734816 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:45:55.734782 2560 generic.go:358] "Generic (PLEG): container finished" podID="2c3bf277-0981-4611-a983-fa549c1308a0" containerID="951b4e3e2985d1302c9511c1181e242215cc50d8cba2933e930e75de247d7e83" exitCode=0 Apr 24 21:45:55.735016 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:45:55.734831 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-84b86f64f-sjmf8" event={"ID":"2c3bf277-0981-4611-a983-fa549c1308a0","Type":"ContainerDied","Data":"951b4e3e2985d1302c9511c1181e242215cc50d8cba2933e930e75de247d7e83"} Apr 24 21:45:56.739433 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:45:56.739397 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-84b86f64f-sjmf8" event={"ID":"2c3bf277-0981-4611-a983-fa549c1308a0","Type":"ContainerStarted","Data":"b0c4f604c418bb31d0f6cc91f90808f5c81edf226c02fffdde6418251551420c"} Apr 24 21:45:56.761579 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:45:56.761526 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-84b86f64f-sjmf8" podStartSLOduration=20.761508785 podStartE2EDuration="20.761508785s" podCreationTimestamp="2026-04-24 21:45:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:45:56.759319354 +0000 UTC m=+1744.818107026" watchObservedRunningTime="2026-04-24 21:45:56.761508785 +0000 UTC m=+1744.820296458" Apr 24 21:45:57.221761 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:45:57.221734 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-84b86f64f-sjmf8" Apr 24 21:45:57.221919 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:45:57.221773 2560 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-84b86f64f-sjmf8" Apr 24 21:45:57.233940 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:45:57.233899 2560 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-84b86f64f-sjmf8" Apr 24 21:45:57.754318 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:45:57.754294 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-84b86f64f-sjmf8" Apr 24 21:46:09.681153 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:09.681123 2560 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-84b86f64f-sjmf8"] Apr 24 21:46:09.681567 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:09.681386 2560 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-84b86f64f-sjmf8" podUID="2c3bf277-0981-4611-a983-fa549c1308a0" containerName="main" containerID="cri-o://b0c4f604c418bb31d0f6cc91f90808f5c81edf226c02fffdde6418251551420c" gracePeriod=30 Apr 24 21:46:09.924227 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:09.924204 2560 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-84b86f64f-sjmf8" Apr 24 21:46:10.031109 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:10.031080 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2c3bf277-0981-4611-a983-fa549c1308a0-tmp-dir\") pod \"2c3bf277-0981-4611-a983-fa549c1308a0\" (UID: \"2c3bf277-0981-4611-a983-fa549c1308a0\") " Apr 24 21:46:10.031289 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:10.031123 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/2c3bf277-0981-4611-a983-fa549c1308a0-model-cache\") pod \"2c3bf277-0981-4611-a983-fa549c1308a0\" (UID: \"2c3bf277-0981-4611-a983-fa549c1308a0\") " Apr 24 21:46:10.031289 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:10.031148 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bq77x\" (UniqueName: \"kubernetes.io/projected/2c3bf277-0981-4611-a983-fa549c1308a0-kube-api-access-bq77x\") pod \"2c3bf277-0981-4611-a983-fa549c1308a0\" (UID: \"2c3bf277-0981-4611-a983-fa549c1308a0\") " Apr 24 21:46:10.031289 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:10.031176 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2c3bf277-0981-4611-a983-fa549c1308a0-kserve-provision-location\") pod \"2c3bf277-0981-4611-a983-fa549c1308a0\" (UID: \"2c3bf277-0981-4611-a983-fa549c1308a0\") " Apr 24 21:46:10.031289 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:10.031210 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/2c3bf277-0981-4611-a983-fa549c1308a0-home\") pod \"2c3bf277-0981-4611-a983-fa549c1308a0\" (UID: \"2c3bf277-0981-4611-a983-fa549c1308a0\") " Apr 24 21:46:10.031289 ip-10-0-128-142 
kubenswrapper[2560]: I0424 21:46:10.031240 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2c3bf277-0981-4611-a983-fa549c1308a0-dshm\") pod \"2c3bf277-0981-4611-a983-fa549c1308a0\" (UID: \"2c3bf277-0981-4611-a983-fa549c1308a0\") " Apr 24 21:46:10.031289 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:10.031287 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2c3bf277-0981-4611-a983-fa549c1308a0-tls-certs\") pod \"2c3bf277-0981-4611-a983-fa549c1308a0\" (UID: \"2c3bf277-0981-4611-a983-fa549c1308a0\") " Apr 24 21:46:10.031615 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:10.031375 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c3bf277-0981-4611-a983-fa549c1308a0-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "2c3bf277-0981-4611-a983-fa549c1308a0" (UID: "2c3bf277-0981-4611-a983-fa549c1308a0"). InnerVolumeSpecName "tmp-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:46:10.031615 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:10.031406 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c3bf277-0981-4611-a983-fa549c1308a0-model-cache" (OuterVolumeSpecName: "model-cache") pod "2c3bf277-0981-4611-a983-fa549c1308a0" (UID: "2c3bf277-0981-4611-a983-fa549c1308a0"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:46:10.031615 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:10.031550 2560 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2c3bf277-0981-4611-a983-fa549c1308a0-tmp-dir\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\"" Apr 24 21:46:10.031615 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:10.031568 2560 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/2c3bf277-0981-4611-a983-fa549c1308a0-model-cache\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\"" Apr 24 21:46:10.031615 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:10.031561 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c3bf277-0981-4611-a983-fa549c1308a0-home" (OuterVolumeSpecName: "home") pod "2c3bf277-0981-4611-a983-fa549c1308a0" (UID: "2c3bf277-0981-4611-a983-fa549c1308a0"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:46:10.033310 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:10.033278 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c3bf277-0981-4611-a983-fa549c1308a0-dshm" (OuterVolumeSpecName: "dshm") pod "2c3bf277-0981-4611-a983-fa549c1308a0" (UID: "2c3bf277-0981-4611-a983-fa549c1308a0"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:46:10.033434 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:10.033376 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c3bf277-0981-4611-a983-fa549c1308a0-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "2c3bf277-0981-4611-a983-fa549c1308a0" (UID: "2c3bf277-0981-4611-a983-fa549c1308a0"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:46:10.033434 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:10.033396 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c3bf277-0981-4611-a983-fa549c1308a0-kube-api-access-bq77x" (OuterVolumeSpecName: "kube-api-access-bq77x") pod "2c3bf277-0981-4611-a983-fa549c1308a0" (UID: "2c3bf277-0981-4611-a983-fa549c1308a0"). InnerVolumeSpecName "kube-api-access-bq77x". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:46:10.095744 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:10.095710 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c3bf277-0981-4611-a983-fa549c1308a0-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "2c3bf277-0981-4611-a983-fa549c1308a0" (UID: "2c3bf277-0981-4611-a983-fa549c1308a0"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:46:10.132980 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:10.132957 2560 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2c3bf277-0981-4611-a983-fa549c1308a0-tls-certs\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\"" Apr 24 21:46:10.132980 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:10.132980 2560 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bq77x\" (UniqueName: \"kubernetes.io/projected/2c3bf277-0981-4611-a983-fa549c1308a0-kube-api-access-bq77x\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\"" Apr 24 21:46:10.133129 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:10.132990 2560 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2c3bf277-0981-4611-a983-fa549c1308a0-kserve-provision-location\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\"" Apr 24 
21:46:10.133129 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:10.133000 2560 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/2c3bf277-0981-4611-a983-fa549c1308a0-home\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\"" Apr 24 21:46:10.133129 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:10.133008 2560 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2c3bf277-0981-4611-a983-fa549c1308a0-dshm\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\"" Apr 24 21:46:10.791002 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:10.790965 2560 generic.go:358] "Generic (PLEG): container finished" podID="2c3bf277-0981-4611-a983-fa549c1308a0" containerID="b0c4f604c418bb31d0f6cc91f90808f5c81edf226c02fffdde6418251551420c" exitCode=0 Apr 24 21:46:10.791387 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:10.791081 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-84b86f64f-sjmf8" event={"ID":"2c3bf277-0981-4611-a983-fa549c1308a0","Type":"ContainerDied","Data":"b0c4f604c418bb31d0f6cc91f90808f5c81edf226c02fffdde6418251551420c"} Apr 24 21:46:10.791387 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:10.791106 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-84b86f64f-sjmf8" event={"ID":"2c3bf277-0981-4611-a983-fa549c1308a0","Type":"ContainerDied","Data":"0a96ea58c64bd8aba1242a581d485599988cf06303966f4fc09387d442460381"} Apr 24 21:46:10.791387 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:10.791122 2560 scope.go:117] "RemoveContainer" containerID="b0c4f604c418bb31d0f6cc91f90808f5c81edf226c02fffdde6418251551420c" Apr 24 21:46:10.791387 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:10.791084 2560 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-84b86f64f-sjmf8" Apr 24 21:46:10.799684 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:10.799671 2560 scope.go:117] "RemoveContainer" containerID="951b4e3e2985d1302c9511c1181e242215cc50d8cba2933e930e75de247d7e83" Apr 24 21:46:10.809281 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:10.809243 2560 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-84b86f64f-sjmf8"] Apr 24 21:46:10.811707 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:10.811684 2560 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-84b86f64f-sjmf8"] Apr 24 21:46:10.871450 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:10.871431 2560 scope.go:117] "RemoveContainer" containerID="b0c4f604c418bb31d0f6cc91f90808f5c81edf226c02fffdde6418251551420c" Apr 24 21:46:10.871717 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:46:10.871698 2560 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0c4f604c418bb31d0f6cc91f90808f5c81edf226c02fffdde6418251551420c\": container with ID starting with b0c4f604c418bb31d0f6cc91f90808f5c81edf226c02fffdde6418251551420c not found: ID does not exist" containerID="b0c4f604c418bb31d0f6cc91f90808f5c81edf226c02fffdde6418251551420c" Apr 24 21:46:10.871785 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:10.871725 2560 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0c4f604c418bb31d0f6cc91f90808f5c81edf226c02fffdde6418251551420c"} err="failed to get container status \"b0c4f604c418bb31d0f6cc91f90808f5c81edf226c02fffdde6418251551420c\": rpc error: code = NotFound desc = could not find container \"b0c4f604c418bb31d0f6cc91f90808f5c81edf226c02fffdde6418251551420c\": container with ID starting with b0c4f604c418bb31d0f6cc91f90808f5c81edf226c02fffdde6418251551420c not found: ID 
does not exist" Apr 24 21:46:10.871785 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:10.871744 2560 scope.go:117] "RemoveContainer" containerID="951b4e3e2985d1302c9511c1181e242215cc50d8cba2933e930e75de247d7e83" Apr 24 21:46:10.871995 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:46:10.871975 2560 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"951b4e3e2985d1302c9511c1181e242215cc50d8cba2933e930e75de247d7e83\": container with ID starting with 951b4e3e2985d1302c9511c1181e242215cc50d8cba2933e930e75de247d7e83 not found: ID does not exist" containerID="951b4e3e2985d1302c9511c1181e242215cc50d8cba2933e930e75de247d7e83" Apr 24 21:46:10.872047 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:10.872001 2560 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"951b4e3e2985d1302c9511c1181e242215cc50d8cba2933e930e75de247d7e83"} err="failed to get container status \"951b4e3e2985d1302c9511c1181e242215cc50d8cba2933e930e75de247d7e83\": rpc error: code = NotFound desc = could not find container \"951b4e3e2985d1302c9511c1181e242215cc50d8cba2933e930e75de247d7e83\": container with ID starting with 951b4e3e2985d1302c9511c1181e242215cc50d8cba2933e930e75de247d7e83 not found: ID does not exist" Apr 24 21:46:12.446183 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:12.446153 2560 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c3bf277-0981-4611-a983-fa549c1308a0" path="/var/lib/kubelet/pods/2c3bf277-0981-4611-a983-fa549c1308a0/volumes" Apr 24 21:46:13.174653 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:13.174617 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/conv-test-lora-crit-kserve-748bc754b7-hv948"] Apr 24 21:46:13.175119 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:13.175103 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2c3bf277-0981-4611-a983-fa549c1308a0" 
containerName="main" Apr 24 21:46:13.175178 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:13.175123 2560 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c3bf277-0981-4611-a983-fa549c1308a0" containerName="main" Apr 24 21:46:13.175178 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:13.175141 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2c3bf277-0981-4611-a983-fa549c1308a0" containerName="storage-initializer" Apr 24 21:46:13.175178 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:13.175151 2560 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c3bf277-0981-4611-a983-fa549c1308a0" containerName="storage-initializer" Apr 24 21:46:13.175281 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:13.175245 2560 memory_manager.go:356] "RemoveStaleState removing state" podUID="2c3bf277-0981-4611-a983-fa549c1308a0" containerName="main" Apr 24 21:46:13.180484 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:13.180465 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-748bc754b7-hv948" Apr 24 21:46:13.183119 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:13.183096 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"conv-test-lora-crit-kserve-self-signed-certs\"" Apr 24 21:46:13.186478 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:13.186457 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/conv-test-lora-crit-kserve-748bc754b7-hv948"] Apr 24 21:46:13.258978 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:13.258943 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/015bc461-d087-406c-8752-38a218624161-model-cache\") pod \"conv-test-lora-crit-kserve-748bc754b7-hv948\" (UID: \"015bc461-d087-406c-8752-38a218624161\") " pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-748bc754b7-hv948" Apr 24 21:46:13.259134 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:13.258985 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/015bc461-d087-406c-8752-38a218624161-dshm\") pod \"conv-test-lora-crit-kserve-748bc754b7-hv948\" (UID: \"015bc461-d087-406c-8752-38a218624161\") " pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-748bc754b7-hv948" Apr 24 21:46:13.259134 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:13.259034 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/015bc461-d087-406c-8752-38a218624161-home\") pod \"conv-test-lora-crit-kserve-748bc754b7-hv948\" (UID: \"015bc461-d087-406c-8752-38a218624161\") " pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-748bc754b7-hv948" Apr 24 21:46:13.259134 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:13.259074 2560 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/015bc461-d087-406c-8752-38a218624161-kserve-provision-location\") pod \"conv-test-lora-crit-kserve-748bc754b7-hv948\" (UID: \"015bc461-d087-406c-8752-38a218624161\") " pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-748bc754b7-hv948" Apr 24 21:46:13.259253 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:13.259169 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/015bc461-d087-406c-8752-38a218624161-tls-certs\") pod \"conv-test-lora-crit-kserve-748bc754b7-hv948\" (UID: \"015bc461-d087-406c-8752-38a218624161\") " pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-748bc754b7-hv948" Apr 24 21:46:13.259253 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:13.259204 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/015bc461-d087-406c-8752-38a218624161-tmp-dir\") pod \"conv-test-lora-crit-kserve-748bc754b7-hv948\" (UID: \"015bc461-d087-406c-8752-38a218624161\") " pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-748bc754b7-hv948" Apr 24 21:46:13.259253 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:13.259230 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lddg\" (UniqueName: \"kubernetes.io/projected/015bc461-d087-406c-8752-38a218624161-kube-api-access-9lddg\") pod \"conv-test-lora-crit-kserve-748bc754b7-hv948\" (UID: \"015bc461-d087-406c-8752-38a218624161\") " pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-748bc754b7-hv948" Apr 24 21:46:13.359978 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:13.359944 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/015bc461-d087-406c-8752-38a218624161-tls-certs\") pod \"conv-test-lora-crit-kserve-748bc754b7-hv948\" (UID: \"015bc461-d087-406c-8752-38a218624161\") " pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-748bc754b7-hv948" Apr 24 21:46:13.359978 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:13.359980 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/015bc461-d087-406c-8752-38a218624161-tmp-dir\") pod \"conv-test-lora-crit-kserve-748bc754b7-hv948\" (UID: \"015bc461-d087-406c-8752-38a218624161\") " pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-748bc754b7-hv948" Apr 24 21:46:13.360192 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:13.360012 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9lddg\" (UniqueName: \"kubernetes.io/projected/015bc461-d087-406c-8752-38a218624161-kube-api-access-9lddg\") pod \"conv-test-lora-crit-kserve-748bc754b7-hv948\" (UID: \"015bc461-d087-406c-8752-38a218624161\") " pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-748bc754b7-hv948" Apr 24 21:46:13.360192 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:13.360036 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/015bc461-d087-406c-8752-38a218624161-model-cache\") pod \"conv-test-lora-crit-kserve-748bc754b7-hv948\" (UID: \"015bc461-d087-406c-8752-38a218624161\") " pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-748bc754b7-hv948" Apr 24 21:46:13.360192 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:13.360058 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/015bc461-d087-406c-8752-38a218624161-dshm\") pod \"conv-test-lora-crit-kserve-748bc754b7-hv948\" (UID: \"015bc461-d087-406c-8752-38a218624161\") " 
pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-748bc754b7-hv948" Apr 24 21:46:13.360192 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:13.360091 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/015bc461-d087-406c-8752-38a218624161-home\") pod \"conv-test-lora-crit-kserve-748bc754b7-hv948\" (UID: \"015bc461-d087-406c-8752-38a218624161\") " pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-748bc754b7-hv948" Apr 24 21:46:13.360192 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:13.360134 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/015bc461-d087-406c-8752-38a218624161-kserve-provision-location\") pod \"conv-test-lora-crit-kserve-748bc754b7-hv948\" (UID: \"015bc461-d087-406c-8752-38a218624161\") " pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-748bc754b7-hv948" Apr 24 21:46:13.360497 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:13.360473 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/015bc461-d087-406c-8752-38a218624161-tmp-dir\") pod \"conv-test-lora-crit-kserve-748bc754b7-hv948\" (UID: \"015bc461-d087-406c-8752-38a218624161\") " pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-748bc754b7-hv948" Apr 24 21:46:13.360592 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:13.360562 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/015bc461-d087-406c-8752-38a218624161-kserve-provision-location\") pod \"conv-test-lora-crit-kserve-748bc754b7-hv948\" (UID: \"015bc461-d087-406c-8752-38a218624161\") " pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-748bc754b7-hv948" Apr 24 21:46:13.360656 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:13.360605 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/015bc461-d087-406c-8752-38a218624161-home\") pod \"conv-test-lora-crit-kserve-748bc754b7-hv948\" (UID: \"015bc461-d087-406c-8752-38a218624161\") " pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-748bc754b7-hv948" Apr 24 21:46:13.360707 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:13.360655 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/015bc461-d087-406c-8752-38a218624161-model-cache\") pod \"conv-test-lora-crit-kserve-748bc754b7-hv948\" (UID: \"015bc461-d087-406c-8752-38a218624161\") " pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-748bc754b7-hv948" Apr 24 21:46:13.362273 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:13.362242 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/015bc461-d087-406c-8752-38a218624161-dshm\") pod \"conv-test-lora-crit-kserve-748bc754b7-hv948\" (UID: \"015bc461-d087-406c-8752-38a218624161\") " pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-748bc754b7-hv948" Apr 24 21:46:13.362383 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:13.362364 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/015bc461-d087-406c-8752-38a218624161-tls-certs\") pod \"conv-test-lora-crit-kserve-748bc754b7-hv948\" (UID: \"015bc461-d087-406c-8752-38a218624161\") " pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-748bc754b7-hv948" Apr 24 21:46:13.369489 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:13.369464 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lddg\" (UniqueName: \"kubernetes.io/projected/015bc461-d087-406c-8752-38a218624161-kube-api-access-9lddg\") pod \"conv-test-lora-crit-kserve-748bc754b7-hv948\" (UID: \"015bc461-d087-406c-8752-38a218624161\") " 
pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-748bc754b7-hv948" Apr 24 21:46:13.492882 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:13.492787 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-748bc754b7-hv948" Apr 24 21:46:13.625472 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:13.625449 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/conv-test-lora-crit-kserve-748bc754b7-hv948"] Apr 24 21:46:13.626914 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:46:13.626881 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod015bc461_d087_406c_8752_38a218624161.slice/crio-ede729927009895baa90caadce988f0f37cdf8609652561f6b7c2bce859dcfb9 WatchSource:0}: Error finding container ede729927009895baa90caadce988f0f37cdf8609652561f6b7c2bce859dcfb9: Status 404 returned error can't find the container with id ede729927009895baa90caadce988f0f37cdf8609652561f6b7c2bce859dcfb9 Apr 24 21:46:13.804676 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:13.804643 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-748bc754b7-hv948" event={"ID":"015bc461-d087-406c-8752-38a218624161","Type":"ContainerStarted","Data":"c27b80172de612ad0a2b328d61ddfdcaa0788b64792d9804a083f62683d92d03"} Apr 24 21:46:13.804676 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:13.804682 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-748bc754b7-hv948" event={"ID":"015bc461-d087-406c-8752-38a218624161","Type":"ContainerStarted","Data":"ede729927009895baa90caadce988f0f37cdf8609652561f6b7c2bce859dcfb9"} Apr 24 21:46:14.809708 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:14.809683 2560 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_conv-test-lora-crit-kserve-748bc754b7-hv948_015bc461-d087-406c-8752-38a218624161/storage-initializer/0.log" Apr 24 21:46:14.810097 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:14.809720 2560 generic.go:358] "Generic (PLEG): container finished" podID="015bc461-d087-406c-8752-38a218624161" containerID="c27b80172de612ad0a2b328d61ddfdcaa0788b64792d9804a083f62683d92d03" exitCode=1 Apr 24 21:46:14.810097 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:14.809777 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-748bc754b7-hv948" event={"ID":"015bc461-d087-406c-8752-38a218624161","Type":"ContainerDied","Data":"c27b80172de612ad0a2b328d61ddfdcaa0788b64792d9804a083f62683d92d03"} Apr 24 21:46:15.815494 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:15.815468 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_conv-test-lora-crit-kserve-748bc754b7-hv948_015bc461-d087-406c-8752-38a218624161/storage-initializer/1.log" Apr 24 21:46:15.815887 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:15.815841 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_conv-test-lora-crit-kserve-748bc754b7-hv948_015bc461-d087-406c-8752-38a218624161/storage-initializer/0.log" Apr 24 21:46:15.815887 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:15.815872 2560 generic.go:358] "Generic (PLEG): container finished" podID="015bc461-d087-406c-8752-38a218624161" containerID="f80be8905458f85e3383bb512d95bed5668c8b1685c1b21d57086792f40e2600" exitCode=1 Apr 24 21:46:15.815997 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:15.815904 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-748bc754b7-hv948" event={"ID":"015bc461-d087-406c-8752-38a218624161","Type":"ContainerDied","Data":"f80be8905458f85e3383bb512d95bed5668c8b1685c1b21d57086792f40e2600"} Apr 24 21:46:15.815997 ip-10-0-128-142 
kubenswrapper[2560]: I0424 21:46:15.815976 2560 scope.go:117] "RemoveContainer" containerID="c27b80172de612ad0a2b328d61ddfdcaa0788b64792d9804a083f62683d92d03" Apr 24 21:46:15.816296 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:15.816279 2560 scope.go:117] "RemoveContainer" containerID="c27b80172de612ad0a2b328d61ddfdcaa0788b64792d9804a083f62683d92d03" Apr 24 21:46:15.826436 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:46:15.826408 2560 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_conv-test-lora-crit-kserve-748bc754b7-hv948_kserve-ci-e2e-test_015bc461-d087-406c-8752-38a218624161_0 in pod sandbox ede729927009895baa90caadce988f0f37cdf8609652561f6b7c2bce859dcfb9 from index: no such id: 'c27b80172de612ad0a2b328d61ddfdcaa0788b64792d9804a083f62683d92d03'" containerID="c27b80172de612ad0a2b328d61ddfdcaa0788b64792d9804a083f62683d92d03" Apr 24 21:46:15.826508 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:46:15.826453 2560 kuberuntime_container.go:951] "Unhandled Error" err="failed to remove pod init container \"storage-initializer\": rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_conv-test-lora-crit-kserve-748bc754b7-hv948_kserve-ci-e2e-test_015bc461-d087-406c-8752-38a218624161_0 in pod sandbox ede729927009895baa90caadce988f0f37cdf8609652561f6b7c2bce859dcfb9 from index: no such id: 'c27b80172de612ad0a2b328d61ddfdcaa0788b64792d9804a083f62683d92d03'; Skipping pod \"conv-test-lora-crit-kserve-748bc754b7-hv948_kserve-ci-e2e-test(015bc461-d087-406c-8752-38a218624161)\"" logger="UnhandledError" Apr 24 21:46:15.827750 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:46:15.827733 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer 
pod=conv-test-lora-crit-kserve-748bc754b7-hv948_kserve-ci-e2e-test(015bc461-d087-406c-8752-38a218624161)\"" pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-748bc754b7-hv948" podUID="015bc461-d087-406c-8752-38a218624161" Apr 24 21:46:16.821035 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:16.821007 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_conv-test-lora-crit-kserve-748bc754b7-hv948_015bc461-d087-406c-8752-38a218624161/storage-initializer/1.log" Apr 24 21:46:16.821526 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:46:16.821504 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=conv-test-lora-crit-kserve-748bc754b7-hv948_kserve-ci-e2e-test(015bc461-d087-406c-8752-38a218624161)\"" pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-748bc754b7-hv948" podUID="015bc461-d087-406c-8752-38a218624161" Apr 24 21:46:22.177385 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:22.177353 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-bbc488f8c-xwdgj"] Apr 24 21:46:22.180563 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:22.180545 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-bbc488f8c-xwdgj" Apr 24 21:46:22.186449 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:22.186426 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-kserve-self-signed-certs\"" Apr 24 21:46:22.197050 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:22.197025 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-bbc488f8c-xwdgj"] Apr 24 21:46:22.225663 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:22.225636 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1a791291-5ef7-475a-b318-15889388f7b8-tmp-dir\") pod \"stop-feature-test-kserve-bbc488f8c-xwdgj\" (UID: \"1a791291-5ef7-475a-b318-15889388f7b8\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-bbc488f8c-xwdgj" Apr 24 21:46:22.225767 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:22.225667 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1a791291-5ef7-475a-b318-15889388f7b8-tls-certs\") pod \"stop-feature-test-kserve-bbc488f8c-xwdgj\" (UID: \"1a791291-5ef7-475a-b318-15889388f7b8\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-bbc488f8c-xwdgj" Apr 24 21:46:22.225767 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:22.225690 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1a791291-5ef7-475a-b318-15889388f7b8-home\") pod \"stop-feature-test-kserve-bbc488f8c-xwdgj\" (UID: \"1a791291-5ef7-475a-b318-15889388f7b8\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-bbc488f8c-xwdgj" Apr 24 21:46:22.225767 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:22.225711 2560 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6x5wk\" (UniqueName: \"kubernetes.io/projected/1a791291-5ef7-475a-b318-15889388f7b8-kube-api-access-6x5wk\") pod \"stop-feature-test-kserve-bbc488f8c-xwdgj\" (UID: \"1a791291-5ef7-475a-b318-15889388f7b8\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-bbc488f8c-xwdgj" Apr 24 21:46:22.225876 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:22.225771 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1a791291-5ef7-475a-b318-15889388f7b8-kserve-provision-location\") pod \"stop-feature-test-kserve-bbc488f8c-xwdgj\" (UID: \"1a791291-5ef7-475a-b318-15889388f7b8\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-bbc488f8c-xwdgj" Apr 24 21:46:22.225876 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:22.225835 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1a791291-5ef7-475a-b318-15889388f7b8-model-cache\") pod \"stop-feature-test-kserve-bbc488f8c-xwdgj\" (UID: \"1a791291-5ef7-475a-b318-15889388f7b8\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-bbc488f8c-xwdgj" Apr 24 21:46:22.225876 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:22.225863 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1a791291-5ef7-475a-b318-15889388f7b8-dshm\") pod \"stop-feature-test-kserve-bbc488f8c-xwdgj\" (UID: \"1a791291-5ef7-475a-b318-15889388f7b8\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-bbc488f8c-xwdgj" Apr 24 21:46:22.326936 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:22.326900 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1a791291-5ef7-475a-b318-15889388f7b8-tmp-dir\") 
pod \"stop-feature-test-kserve-bbc488f8c-xwdgj\" (UID: \"1a791291-5ef7-475a-b318-15889388f7b8\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-bbc488f8c-xwdgj" Apr 24 21:46:22.327099 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:22.326952 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1a791291-5ef7-475a-b318-15889388f7b8-tls-certs\") pod \"stop-feature-test-kserve-bbc488f8c-xwdgj\" (UID: \"1a791291-5ef7-475a-b318-15889388f7b8\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-bbc488f8c-xwdgj" Apr 24 21:46:22.327099 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:22.326988 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1a791291-5ef7-475a-b318-15889388f7b8-home\") pod \"stop-feature-test-kserve-bbc488f8c-xwdgj\" (UID: \"1a791291-5ef7-475a-b318-15889388f7b8\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-bbc488f8c-xwdgj" Apr 24 21:46:22.327099 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:22.327016 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6x5wk\" (UniqueName: \"kubernetes.io/projected/1a791291-5ef7-475a-b318-15889388f7b8-kube-api-access-6x5wk\") pod \"stop-feature-test-kserve-bbc488f8c-xwdgj\" (UID: \"1a791291-5ef7-475a-b318-15889388f7b8\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-bbc488f8c-xwdgj" Apr 24 21:46:22.327099 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:22.327047 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1a791291-5ef7-475a-b318-15889388f7b8-kserve-provision-location\") pod \"stop-feature-test-kserve-bbc488f8c-xwdgj\" (UID: \"1a791291-5ef7-475a-b318-15889388f7b8\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-bbc488f8c-xwdgj" Apr 24 21:46:22.327334 ip-10-0-128-142 kubenswrapper[2560]: I0424 
21:46:22.327306 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1a791291-5ef7-475a-b318-15889388f7b8-model-cache\") pod \"stop-feature-test-kserve-bbc488f8c-xwdgj\" (UID: \"1a791291-5ef7-475a-b318-15889388f7b8\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-bbc488f8c-xwdgj" Apr 24 21:46:22.327396 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:22.327339 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1a791291-5ef7-475a-b318-15889388f7b8-tmp-dir\") pod \"stop-feature-test-kserve-bbc488f8c-xwdgj\" (UID: \"1a791291-5ef7-475a-b318-15889388f7b8\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-bbc488f8c-xwdgj" Apr 24 21:46:22.327396 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:22.327357 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1a791291-5ef7-475a-b318-15889388f7b8-dshm\") pod \"stop-feature-test-kserve-bbc488f8c-xwdgj\" (UID: \"1a791291-5ef7-475a-b318-15889388f7b8\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-bbc488f8c-xwdgj" Apr 24 21:46:22.327491 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:22.327357 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1a791291-5ef7-475a-b318-15889388f7b8-kserve-provision-location\") pod \"stop-feature-test-kserve-bbc488f8c-xwdgj\" (UID: \"1a791291-5ef7-475a-b318-15889388f7b8\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-bbc488f8c-xwdgj" Apr 24 21:46:22.327491 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:22.327425 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1a791291-5ef7-475a-b318-15889388f7b8-home\") pod \"stop-feature-test-kserve-bbc488f8c-xwdgj\" (UID: 
\"1a791291-5ef7-475a-b318-15889388f7b8\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-bbc488f8c-xwdgj" Apr 24 21:46:22.327610 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:22.327592 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1a791291-5ef7-475a-b318-15889388f7b8-model-cache\") pod \"stop-feature-test-kserve-bbc488f8c-xwdgj\" (UID: \"1a791291-5ef7-475a-b318-15889388f7b8\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-bbc488f8c-xwdgj" Apr 24 21:46:22.329351 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:22.329326 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1a791291-5ef7-475a-b318-15889388f7b8-dshm\") pod \"stop-feature-test-kserve-bbc488f8c-xwdgj\" (UID: \"1a791291-5ef7-475a-b318-15889388f7b8\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-bbc488f8c-xwdgj" Apr 24 21:46:22.329450 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:22.329370 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1a791291-5ef7-475a-b318-15889388f7b8-tls-certs\") pod \"stop-feature-test-kserve-bbc488f8c-xwdgj\" (UID: \"1a791291-5ef7-475a-b318-15889388f7b8\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-bbc488f8c-xwdgj" Apr 24 21:46:22.334837 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:22.334819 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6x5wk\" (UniqueName: \"kubernetes.io/projected/1a791291-5ef7-475a-b318-15889388f7b8-kube-api-access-6x5wk\") pod \"stop-feature-test-kserve-bbc488f8c-xwdgj\" (UID: \"1a791291-5ef7-475a-b318-15889388f7b8\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-bbc488f8c-xwdgj" Apr 24 21:46:22.489830 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:22.489751 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-bbc488f8c-xwdgj" Apr 24 21:46:22.616285 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:22.616259 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-bbc488f8c-xwdgj"] Apr 24 21:46:22.618143 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:46:22.618115 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1a791291_5ef7_475a_b318_15889388f7b8.slice/crio-ceb8a8b688f59a411cedb77eafbea10178775684f2ba9eaddf12afaa7dc0b44a WatchSource:0}: Error finding container ceb8a8b688f59a411cedb77eafbea10178775684f2ba9eaddf12afaa7dc0b44a: Status 404 returned error can't find the container with id ceb8a8b688f59a411cedb77eafbea10178775684f2ba9eaddf12afaa7dc0b44a Apr 24 21:46:22.842173 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:22.842136 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-bbc488f8c-xwdgj" event={"ID":"1a791291-5ef7-475a-b318-15889388f7b8","Type":"ContainerStarted","Data":"7df52fb13562d5f540f27215b1bf5ee50915b3bf92b59c1167208568ab4f9bd9"} Apr 24 21:46:22.842380 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:22.842180 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-bbc488f8c-xwdgj" event={"ID":"1a791291-5ef7-475a-b318-15889388f7b8","Type":"ContainerStarted","Data":"ceb8a8b688f59a411cedb77eafbea10178775684f2ba9eaddf12afaa7dc0b44a"} Apr 24 21:46:25.869660 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:25.869628 2560 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/conv-test-lora-crit-kserve-748bc754b7-hv948"] Apr 24 21:46:25.994710 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:25.994688 2560 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_conv-test-lora-crit-kserve-748bc754b7-hv948_015bc461-d087-406c-8752-38a218624161/storage-initializer/1.log" Apr 24 21:46:25.994822 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:25.994750 2560 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-748bc754b7-hv948" Apr 24 21:46:26.061394 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:26.061359 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/015bc461-d087-406c-8752-38a218624161-tls-certs\") pod \"015bc461-d087-406c-8752-38a218624161\" (UID: \"015bc461-d087-406c-8752-38a218624161\") " Apr 24 21:46:26.061515 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:26.061419 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/015bc461-d087-406c-8752-38a218624161-model-cache\") pod \"015bc461-d087-406c-8752-38a218624161\" (UID: \"015bc461-d087-406c-8752-38a218624161\") " Apr 24 21:46:26.061515 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:26.061454 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/015bc461-d087-406c-8752-38a218624161-home\") pod \"015bc461-d087-406c-8752-38a218624161\" (UID: \"015bc461-d087-406c-8752-38a218624161\") " Apr 24 21:46:26.061515 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:26.061478 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/015bc461-d087-406c-8752-38a218624161-tmp-dir\") pod \"015bc461-d087-406c-8752-38a218624161\" (UID: \"015bc461-d087-406c-8752-38a218624161\") " Apr 24 21:46:26.061657 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:26.061533 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/015bc461-d087-406c-8752-38a218624161-kserve-provision-location\") pod \"015bc461-d087-406c-8752-38a218624161\" (UID: \"015bc461-d087-406c-8752-38a218624161\") " Apr 24 21:46:26.061657 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:26.061562 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/015bc461-d087-406c-8752-38a218624161-dshm\") pod \"015bc461-d087-406c-8752-38a218624161\" (UID: \"015bc461-d087-406c-8752-38a218624161\") " Apr 24 21:46:26.061764 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:26.061584 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9lddg\" (UniqueName: \"kubernetes.io/projected/015bc461-d087-406c-8752-38a218624161-kube-api-access-9lddg\") pod \"015bc461-d087-406c-8752-38a218624161\" (UID: \"015bc461-d087-406c-8752-38a218624161\") " Apr 24 21:46:26.061764 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:26.061704 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/015bc461-d087-406c-8752-38a218624161-model-cache" (OuterVolumeSpecName: "model-cache") pod "015bc461-d087-406c-8752-38a218624161" (UID: "015bc461-d087-406c-8752-38a218624161"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:46:26.061764 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:26.061713 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/015bc461-d087-406c-8752-38a218624161-home" (OuterVolumeSpecName: "home") pod "015bc461-d087-406c-8752-38a218624161" (UID: "015bc461-d087-406c-8752-38a218624161"). InnerVolumeSpecName "home". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:46:26.061954 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:26.061782 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/015bc461-d087-406c-8752-38a218624161-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "015bc461-d087-406c-8752-38a218624161" (UID: "015bc461-d087-406c-8752-38a218624161"). InnerVolumeSpecName "tmp-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:46:26.061954 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:26.061905 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/015bc461-d087-406c-8752-38a218624161-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "015bc461-d087-406c-8752-38a218624161" (UID: "015bc461-d087-406c-8752-38a218624161"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:46:26.062039 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:26.062011 2560 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/015bc461-d087-406c-8752-38a218624161-kserve-provision-location\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\"" Apr 24 21:46:26.062039 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:26.062032 2560 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/015bc461-d087-406c-8752-38a218624161-model-cache\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\"" Apr 24 21:46:26.062143 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:26.062045 2560 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/015bc461-d087-406c-8752-38a218624161-home\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\"" Apr 24 21:46:26.062143 ip-10-0-128-142 kubenswrapper[2560]: I0424 
21:46:26.062059 2560 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/015bc461-d087-406c-8752-38a218624161-tmp-dir\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\"" Apr 24 21:46:26.063412 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:26.063386 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/015bc461-d087-406c-8752-38a218624161-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "015bc461-d087-406c-8752-38a218624161" (UID: "015bc461-d087-406c-8752-38a218624161"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:46:26.063517 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:26.063502 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/015bc461-d087-406c-8752-38a218624161-dshm" (OuterVolumeSpecName: "dshm") pod "015bc461-d087-406c-8752-38a218624161" (UID: "015bc461-d087-406c-8752-38a218624161"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:46:26.063667 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:26.063655 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/015bc461-d087-406c-8752-38a218624161-kube-api-access-9lddg" (OuterVolumeSpecName: "kube-api-access-9lddg") pod "015bc461-d087-406c-8752-38a218624161" (UID: "015bc461-d087-406c-8752-38a218624161"). InnerVolumeSpecName "kube-api-access-9lddg". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:46:26.163402 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:26.163341 2560 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/015bc461-d087-406c-8752-38a218624161-dshm\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\"" Apr 24 21:46:26.163402 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:26.163366 2560 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9lddg\" (UniqueName: \"kubernetes.io/projected/015bc461-d087-406c-8752-38a218624161-kube-api-access-9lddg\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\"" Apr 24 21:46:26.163402 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:26.163377 2560 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/015bc461-d087-406c-8752-38a218624161-tls-certs\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\"" Apr 24 21:46:26.859657 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:26.859631 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_conv-test-lora-crit-kserve-748bc754b7-hv948_015bc461-d087-406c-8752-38a218624161/storage-initializer/1.log" Apr 24 21:46:26.859808 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:26.859715 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-748bc754b7-hv948" event={"ID":"015bc461-d087-406c-8752-38a218624161","Type":"ContainerDied","Data":"ede729927009895baa90caadce988f0f37cdf8609652561f6b7c2bce859dcfb9"} Apr 24 21:46:26.859808 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:26.859745 2560 scope.go:117] "RemoveContainer" containerID="f80be8905458f85e3383bb512d95bed5668c8b1685c1b21d57086792f40e2600" Apr 24 21:46:26.859808 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:26.859763 2560 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-748bc754b7-hv948" Apr 24 21:46:26.896017 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:26.895991 2560 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/conv-test-lora-crit-kserve-748bc754b7-hv948"] Apr 24 21:46:26.900760 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:26.900737 2560 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/conv-test-lora-crit-kserve-748bc754b7-hv948"] Apr 24 21:46:28.445873 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:28.445840 2560 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="015bc461-d087-406c-8752-38a218624161" path="/var/lib/kubelet/pods/015bc461-d087-406c-8752-38a218624161/volumes" Apr 24 21:46:52.482041 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:52.482008 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6rbsp_260be1ca-939b-4b9e-9f93-078d2506aef0/ovn-acl-logging/0.log" Apr 24 21:46:52.483035 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:52.483012 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-128-142.ec2.internal_784ad9f5e87f1e75291c25fc06106e5e/kube-rbac-proxy-crio/2.log" Apr 24 21:46:52.485729 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:52.485713 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6rbsp_260be1ca-939b-4b9e-9f93-078d2506aef0/ovn-acl-logging/0.log" Apr 24 21:46:52.486552 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:46:52.486536 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-128-142.ec2.internal_784ad9f5e87f1e75291c25fc06106e5e/kube-rbac-proxy-crio/2.log" Apr 24 21:47:49.146392 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:47:49.146356 2560 generic.go:358] "Generic (PLEG): container finished" 
podID="1a791291-5ef7-475a-b318-15889388f7b8" containerID="7df52fb13562d5f540f27215b1bf5ee50915b3bf92b59c1167208568ab4f9bd9" exitCode=0 Apr 24 21:47:49.146805 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:47:49.146432 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-bbc488f8c-xwdgj" event={"ID":"1a791291-5ef7-475a-b318-15889388f7b8","Type":"ContainerDied","Data":"7df52fb13562d5f540f27215b1bf5ee50915b3bf92b59c1167208568ab4f9bd9"} Apr 24 21:47:50.152199 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:47:50.152155 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-bbc488f8c-xwdgj" event={"ID":"1a791291-5ef7-475a-b318-15889388f7b8","Type":"ContainerStarted","Data":"e4cfd9918afc3e04f35e04203c8fc7223df8b5a95494bed66f0eb0fd05c5c837"} Apr 24 21:47:50.174310 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:47:50.174254 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/stop-feature-test-kserve-bbc488f8c-xwdgj" podStartSLOduration=88.174237159 podStartE2EDuration="1m28.174237159s" podCreationTimestamp="2026-04-24 21:46:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:47:50.1720687 +0000 UTC m=+1858.230856372" watchObservedRunningTime="2026-04-24 21:47:50.174237159 +0000 UTC m=+1858.233024832" Apr 24 21:47:52.490799 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:47:52.490745 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-bbc488f8c-xwdgj" Apr 24 21:47:52.490799 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:47:52.490795 2560 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/stop-feature-test-kserve-bbc488f8c-xwdgj" Apr 24 21:47:52.492185 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:47:52.492148 2560 
prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-bbc488f8c-xwdgj" podUID="1a791291-5ef7-475a-b318-15889388f7b8" containerName="main" probeResult="failure" output="Get \"https://10.134.0.47:8000/health\": dial tcp 10.134.0.47:8000: connect: connection refused" Apr 24 21:48:02.490555 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:48:02.490510 2560 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-bbc488f8c-xwdgj" podUID="1a791291-5ef7-475a-b318-15889388f7b8" containerName="main" probeResult="failure" output="Get \"https://10.134.0.47:8000/health\": dial tcp 10.134.0.47:8000: connect: connection refused" Apr 24 21:48:12.491065 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:48:12.490970 2560 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-bbc488f8c-xwdgj" podUID="1a791291-5ef7-475a-b318-15889388f7b8" containerName="main" probeResult="failure" output="Get \"https://10.134.0.47:8000/health\": dial tcp 10.134.0.47:8000: connect: connection refused" Apr 24 21:48:22.490396 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:48:22.490351 2560 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-bbc488f8c-xwdgj" podUID="1a791291-5ef7-475a-b318-15889388f7b8" containerName="main" probeResult="failure" output="Get \"https://10.134.0.47:8000/health\": dial tcp 10.134.0.47:8000: connect: connection refused" Apr 24 21:48:32.491024 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:48:32.490986 2560 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-bbc488f8c-xwdgj" podUID="1a791291-5ef7-475a-b318-15889388f7b8" containerName="main" probeResult="failure" output="Get \"https://10.134.0.47:8000/health\": dial tcp 10.134.0.47:8000: connect: connection refused" Apr 24 21:48:42.491001 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:48:42.490954 2560 prober.go:120] 
"Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-bbc488f8c-xwdgj" podUID="1a791291-5ef7-475a-b318-15889388f7b8" containerName="main" probeResult="failure" output="Get \"https://10.134.0.47:8000/health\": dial tcp 10.134.0.47:8000: connect: connection refused" Apr 24 21:48:52.490751 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:48:52.490705 2560 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-bbc488f8c-xwdgj" podUID="1a791291-5ef7-475a-b318-15889388f7b8" containerName="main" probeResult="failure" output="Get \"https://10.134.0.47:8000/health\": dial tcp 10.134.0.47:8000: connect: connection refused" Apr 24 21:49:02.490713 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:49:02.490665 2560 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-bbc488f8c-xwdgj" podUID="1a791291-5ef7-475a-b318-15889388f7b8" containerName="main" probeResult="failure" output="Get \"https://10.134.0.47:8000/health\": dial tcp 10.134.0.47:8000: connect: connection refused" Apr 24 21:49:12.490569 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:49:12.490519 2560 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-bbc488f8c-xwdgj" podUID="1a791291-5ef7-475a-b318-15889388f7b8" containerName="main" probeResult="failure" output="Get \"https://10.134.0.47:8000/health\": dial tcp 10.134.0.47:8000: connect: connection refused" Apr 24 21:49:22.501003 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:49:22.500973 2560 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/stop-feature-test-kserve-bbc488f8c-xwdgj" Apr 24 21:49:22.508950 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:49:22.508916 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-bbc488f8c-xwdgj" Apr 24 21:49:23.667467 ip-10-0-128-142 kubenswrapper[2560]: I0424 
21:49:23.667438 2560 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-bbc488f8c-xwdgj"] Apr 24 21:49:23.686315 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:49:23.686294 2560 secret.go:189] Couldn't get secret kserve-ci-e2e-test/stop-feature-test-kserve-self-signed-certs: secret "stop-feature-test-kserve-self-signed-certs" not found Apr 24 21:49:23.686435 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:49:23.686359 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1a791291-5ef7-475a-b318-15889388f7b8-tls-certs podName:1a791291-5ef7-475a-b318-15889388f7b8 nodeName:}" failed. No retries permitted until 2026-04-24 21:49:24.186341551 +0000 UTC m=+1952.245129201 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/1a791291-5ef7-475a-b318-15889388f7b8-tls-certs") pod "stop-feature-test-kserve-bbc488f8c-xwdgj" (UID: "1a791291-5ef7-475a-b318-15889388f7b8") : secret "stop-feature-test-kserve-self-signed-certs" not found Apr 24 21:49:24.190585 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:49:24.190546 2560 secret.go:189] Couldn't get secret kserve-ci-e2e-test/stop-feature-test-kserve-self-signed-certs: secret "stop-feature-test-kserve-self-signed-certs" not found Apr 24 21:49:24.190770 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:49:24.190616 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1a791291-5ef7-475a-b318-15889388f7b8-tls-certs podName:1a791291-5ef7-475a-b318-15889388f7b8 nodeName:}" failed. No retries permitted until 2026-04-24 21:49:25.190602197 +0000 UTC m=+1953.249389847 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/1a791291-5ef7-475a-b318-15889388f7b8-tls-certs") pod "stop-feature-test-kserve-bbc488f8c-xwdgj" (UID: "1a791291-5ef7-475a-b318-15889388f7b8") : secret "stop-feature-test-kserve-self-signed-certs" not found Apr 24 21:49:24.500619 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:49:24.500530 2560 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-bbc488f8c-xwdgj" podUID="1a791291-5ef7-475a-b318-15889388f7b8" containerName="main" containerID="cri-o://e4cfd9918afc3e04f35e04203c8fc7223df8b5a95494bed66f0eb0fd05c5c837" gracePeriod=30 Apr 24 21:49:25.197635 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:49:25.197601 2560 secret.go:189] Couldn't get secret kserve-ci-e2e-test/stop-feature-test-kserve-self-signed-certs: secret "stop-feature-test-kserve-self-signed-certs" not found Apr 24 21:49:25.198032 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:49:25.197691 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1a791291-5ef7-475a-b318-15889388f7b8-tls-certs podName:1a791291-5ef7-475a-b318-15889388f7b8 nodeName:}" failed. No retries permitted until 2026-04-24 21:49:27.197676784 +0000 UTC m=+1955.256464434 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/1a791291-5ef7-475a-b318-15889388f7b8-tls-certs") pod "stop-feature-test-kserve-bbc488f8c-xwdgj" (UID: "1a791291-5ef7-475a-b318-15889388f7b8") : secret "stop-feature-test-kserve-self-signed-certs" not found Apr 24 21:49:27.216714 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:49:27.216676 2560 secret.go:189] Couldn't get secret kserve-ci-e2e-test/stop-feature-test-kserve-self-signed-certs: secret "stop-feature-test-kserve-self-signed-certs" not found Apr 24 21:49:27.217111 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:49:27.216747 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1a791291-5ef7-475a-b318-15889388f7b8-tls-certs podName:1a791291-5ef7-475a-b318-15889388f7b8 nodeName:}" failed. No retries permitted until 2026-04-24 21:49:31.216731317 +0000 UTC m=+1959.275518972 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/1a791291-5ef7-475a-b318-15889388f7b8-tls-certs") pod "stop-feature-test-kserve-bbc488f8c-xwdgj" (UID: "1a791291-5ef7-475a-b318-15889388f7b8") : secret "stop-feature-test-kserve-self-signed-certs" not found Apr 24 21:49:31.253103 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:49:31.253066 2560 secret.go:189] Couldn't get secret kserve-ci-e2e-test/stop-feature-test-kserve-self-signed-certs: secret "stop-feature-test-kserve-self-signed-certs" not found Apr 24 21:49:31.253492 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:49:31.253137 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1a791291-5ef7-475a-b318-15889388f7b8-tls-certs podName:1a791291-5ef7-475a-b318-15889388f7b8 nodeName:}" failed. No retries permitted until 2026-04-24 21:49:39.253121727 +0000 UTC m=+1967.311909376 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/1a791291-5ef7-475a-b318-15889388f7b8-tls-certs") pod "stop-feature-test-kserve-bbc488f8c-xwdgj" (UID: "1a791291-5ef7-475a-b318-15889388f7b8") : secret "stop-feature-test-kserve-self-signed-certs" not found Apr 24 21:49:39.326618 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:49:39.326583 2560 secret.go:189] Couldn't get secret kserve-ci-e2e-test/stop-feature-test-kserve-self-signed-certs: secret "stop-feature-test-kserve-self-signed-certs" not found Apr 24 21:49:39.327034 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:49:39.326658 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1a791291-5ef7-475a-b318-15889388f7b8-tls-certs podName:1a791291-5ef7-475a-b318-15889388f7b8 nodeName:}" failed. No retries permitted until 2026-04-24 21:49:55.326640416 +0000 UTC m=+1983.385428085 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/1a791291-5ef7-475a-b318-15889388f7b8-tls-certs") pod "stop-feature-test-kserve-bbc488f8c-xwdgj" (UID: "1a791291-5ef7-475a-b318-15889388f7b8") : secret "stop-feature-test-kserve-self-signed-certs" not found Apr 24 21:49:39.992990 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:49:39.992954 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-bbc488f8c-mzq9z"] Apr 24 21:49:39.993436 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:49:39.993419 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="015bc461-d087-406c-8752-38a218624161" containerName="storage-initializer" Apr 24 21:49:39.993496 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:49:39.993442 2560 state_mem.go:107] "Deleted CPUSet assignment" podUID="015bc461-d087-406c-8752-38a218624161" containerName="storage-initializer" Apr 24 21:49:39.993553 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:49:39.993541 2560 memory_manager.go:356] 
"RemoveStaleState removing state" podUID="015bc461-d087-406c-8752-38a218624161" containerName="storage-initializer" Apr 24 21:49:39.993593 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:49:39.993558 2560 memory_manager.go:356] "RemoveStaleState removing state" podUID="015bc461-d087-406c-8752-38a218624161" containerName="storage-initializer" Apr 24 21:49:39.993647 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:49:39.993635 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="015bc461-d087-406c-8752-38a218624161" containerName="storage-initializer" Apr 24 21:49:39.993698 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:49:39.993649 2560 state_mem.go:107] "Deleted CPUSet assignment" podUID="015bc461-d087-406c-8752-38a218624161" containerName="storage-initializer" Apr 24 21:49:39.996878 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:49:39.996855 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-bbc488f8c-mzq9z" Apr 24 21:49:40.008524 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:49:40.008498 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-bbc488f8c-mzq9z"] Apr 24 21:49:40.033232 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:49:40.033202 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zwnt\" (UniqueName: \"kubernetes.io/projected/7f6f52ab-95d0-4f28-bdaf-9f4ae9e0291c-kube-api-access-6zwnt\") pod \"stop-feature-test-kserve-bbc488f8c-mzq9z\" (UID: \"7f6f52ab-95d0-4f28-bdaf-9f4ae9e0291c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-bbc488f8c-mzq9z" Apr 24 21:49:40.033348 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:49:40.033257 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7f6f52ab-95d0-4f28-bdaf-9f4ae9e0291c-tmp-dir\") pod 
\"stop-feature-test-kserve-bbc488f8c-mzq9z\" (UID: \"7f6f52ab-95d0-4f28-bdaf-9f4ae9e0291c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-bbc488f8c-mzq9z" Apr 24 21:49:40.033348 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:49:40.033275 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7f6f52ab-95d0-4f28-bdaf-9f4ae9e0291c-tls-certs\") pod \"stop-feature-test-kserve-bbc488f8c-mzq9z\" (UID: \"7f6f52ab-95d0-4f28-bdaf-9f4ae9e0291c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-bbc488f8c-mzq9z" Apr 24 21:49:40.033348 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:49:40.033334 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/7f6f52ab-95d0-4f28-bdaf-9f4ae9e0291c-model-cache\") pod \"stop-feature-test-kserve-bbc488f8c-mzq9z\" (UID: \"7f6f52ab-95d0-4f28-bdaf-9f4ae9e0291c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-bbc488f8c-mzq9z" Apr 24 21:49:40.033473 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:49:40.033383 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/7f6f52ab-95d0-4f28-bdaf-9f4ae9e0291c-dshm\") pod \"stop-feature-test-kserve-bbc488f8c-mzq9z\" (UID: \"7f6f52ab-95d0-4f28-bdaf-9f4ae9e0291c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-bbc488f8c-mzq9z" Apr 24 21:49:40.033473 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:49:40.033403 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/7f6f52ab-95d0-4f28-bdaf-9f4ae9e0291c-home\") pod \"stop-feature-test-kserve-bbc488f8c-mzq9z\" (UID: \"7f6f52ab-95d0-4f28-bdaf-9f4ae9e0291c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-bbc488f8c-mzq9z" Apr 24 21:49:40.033473 ip-10-0-128-142 
kubenswrapper[2560]: I0424 21:49:40.033423 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7f6f52ab-95d0-4f28-bdaf-9f4ae9e0291c-kserve-provision-location\") pod \"stop-feature-test-kserve-bbc488f8c-mzq9z\" (UID: \"7f6f52ab-95d0-4f28-bdaf-9f4ae9e0291c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-bbc488f8c-mzq9z" Apr 24 21:49:40.134267 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:49:40.134228 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/7f6f52ab-95d0-4f28-bdaf-9f4ae9e0291c-home\") pod \"stop-feature-test-kserve-bbc488f8c-mzq9z\" (UID: \"7f6f52ab-95d0-4f28-bdaf-9f4ae9e0291c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-bbc488f8c-mzq9z" Apr 24 21:49:40.134267 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:49:40.134269 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7f6f52ab-95d0-4f28-bdaf-9f4ae9e0291c-kserve-provision-location\") pod \"stop-feature-test-kserve-bbc488f8c-mzq9z\" (UID: \"7f6f52ab-95d0-4f28-bdaf-9f4ae9e0291c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-bbc488f8c-mzq9z" Apr 24 21:49:40.134499 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:49:40.134307 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6zwnt\" (UniqueName: \"kubernetes.io/projected/7f6f52ab-95d0-4f28-bdaf-9f4ae9e0291c-kube-api-access-6zwnt\") pod \"stop-feature-test-kserve-bbc488f8c-mzq9z\" (UID: \"7f6f52ab-95d0-4f28-bdaf-9f4ae9e0291c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-bbc488f8c-mzq9z" Apr 24 21:49:40.134499 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:49:40.134344 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: 
\"kubernetes.io/empty-dir/7f6f52ab-95d0-4f28-bdaf-9f4ae9e0291c-tmp-dir\") pod \"stop-feature-test-kserve-bbc488f8c-mzq9z\" (UID: \"7f6f52ab-95d0-4f28-bdaf-9f4ae9e0291c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-bbc488f8c-mzq9z" Apr 24 21:49:40.134499 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:49:40.134362 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7f6f52ab-95d0-4f28-bdaf-9f4ae9e0291c-tls-certs\") pod \"stop-feature-test-kserve-bbc488f8c-mzq9z\" (UID: \"7f6f52ab-95d0-4f28-bdaf-9f4ae9e0291c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-bbc488f8c-mzq9z" Apr 24 21:49:40.134499 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:49:40.134381 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/7f6f52ab-95d0-4f28-bdaf-9f4ae9e0291c-model-cache\") pod \"stop-feature-test-kserve-bbc488f8c-mzq9z\" (UID: \"7f6f52ab-95d0-4f28-bdaf-9f4ae9e0291c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-bbc488f8c-mzq9z" Apr 24 21:49:40.134499 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:49:40.134422 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/7f6f52ab-95d0-4f28-bdaf-9f4ae9e0291c-dshm\") pod \"stop-feature-test-kserve-bbc488f8c-mzq9z\" (UID: \"7f6f52ab-95d0-4f28-bdaf-9f4ae9e0291c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-bbc488f8c-mzq9z" Apr 24 21:49:40.134767 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:49:40.134646 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/7f6f52ab-95d0-4f28-bdaf-9f4ae9e0291c-home\") pod \"stop-feature-test-kserve-bbc488f8c-mzq9z\" (UID: \"7f6f52ab-95d0-4f28-bdaf-9f4ae9e0291c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-bbc488f8c-mzq9z" Apr 24 21:49:40.134767 ip-10-0-128-142 kubenswrapper[2560]: 
I0424 21:49:40.134699 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7f6f52ab-95d0-4f28-bdaf-9f4ae9e0291c-tmp-dir\") pod \"stop-feature-test-kserve-bbc488f8c-mzq9z\" (UID: \"7f6f52ab-95d0-4f28-bdaf-9f4ae9e0291c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-bbc488f8c-mzq9z" Apr 24 21:49:40.134767 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:49:40.134749 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7f6f52ab-95d0-4f28-bdaf-9f4ae9e0291c-kserve-provision-location\") pod \"stop-feature-test-kserve-bbc488f8c-mzq9z\" (UID: \"7f6f52ab-95d0-4f28-bdaf-9f4ae9e0291c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-bbc488f8c-mzq9z" Apr 24 21:49:40.134889 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:49:40.134799 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/7f6f52ab-95d0-4f28-bdaf-9f4ae9e0291c-model-cache\") pod \"stop-feature-test-kserve-bbc488f8c-mzq9z\" (UID: \"7f6f52ab-95d0-4f28-bdaf-9f4ae9e0291c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-bbc488f8c-mzq9z" Apr 24 21:49:40.136691 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:49:40.136665 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/7f6f52ab-95d0-4f28-bdaf-9f4ae9e0291c-dshm\") pod \"stop-feature-test-kserve-bbc488f8c-mzq9z\" (UID: \"7f6f52ab-95d0-4f28-bdaf-9f4ae9e0291c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-bbc488f8c-mzq9z" Apr 24 21:49:40.137035 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:49:40.137013 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7f6f52ab-95d0-4f28-bdaf-9f4ae9e0291c-tls-certs\") pod \"stop-feature-test-kserve-bbc488f8c-mzq9z\" (UID: 
\"7f6f52ab-95d0-4f28-bdaf-9f4ae9e0291c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-bbc488f8c-mzq9z" Apr 24 21:49:40.141693 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:49:40.141668 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zwnt\" (UniqueName: \"kubernetes.io/projected/7f6f52ab-95d0-4f28-bdaf-9f4ae9e0291c-kube-api-access-6zwnt\") pod \"stop-feature-test-kserve-bbc488f8c-mzq9z\" (UID: \"7f6f52ab-95d0-4f28-bdaf-9f4ae9e0291c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-bbc488f8c-mzq9z" Apr 24 21:49:40.308505 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:49:40.308466 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-bbc488f8c-mzq9z" Apr 24 21:49:40.436855 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:49:40.436814 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-bbc488f8c-mzq9z"] Apr 24 21:49:40.438967 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:49:40.438936 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f6f52ab_95d0_4f28_bdaf_9f4ae9e0291c.slice/crio-32d8f6dde14317d38ef5b73797f50a0504c028c837e1b2fb7cce089c83c10765 WatchSource:0}: Error finding container 32d8f6dde14317d38ef5b73797f50a0504c028c837e1b2fb7cce089c83c10765: Status 404 returned error can't find the container with id 32d8f6dde14317d38ef5b73797f50a0504c028c837e1b2fb7cce089c83c10765 Apr 24 21:49:40.558022 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:49:40.557978 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-bbc488f8c-mzq9z" event={"ID":"7f6f52ab-95d0-4f28-bdaf-9f4ae9e0291c","Type":"ContainerStarted","Data":"23160c895b0788bea9a3ff9a30749e5a5ec88d55feb588965da27f7901a98fa7"} Apr 24 21:49:40.558179 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:49:40.558026 2560 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-bbc488f8c-mzq9z" event={"ID":"7f6f52ab-95d0-4f28-bdaf-9f4ae9e0291c","Type":"ContainerStarted","Data":"32d8f6dde14317d38ef5b73797f50a0504c028c837e1b2fb7cce089c83c10765"} Apr 24 21:49:48.591821 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:49:48.591785 2560 generic.go:358] "Generic (PLEG): container finished" podID="7f6f52ab-95d0-4f28-bdaf-9f4ae9e0291c" containerID="23160c895b0788bea9a3ff9a30749e5a5ec88d55feb588965da27f7901a98fa7" exitCode=0 Apr 24 21:49:48.592272 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:49:48.591865 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-bbc488f8c-mzq9z" event={"ID":"7f6f52ab-95d0-4f28-bdaf-9f4ae9e0291c","Type":"ContainerDied","Data":"23160c895b0788bea9a3ff9a30749e5a5ec88d55feb588965da27f7901a98fa7"} Apr 24 21:49:49.598119 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:49:49.598085 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-bbc488f8c-mzq9z" event={"ID":"7f6f52ab-95d0-4f28-bdaf-9f4ae9e0291c","Type":"ContainerStarted","Data":"4151fefe8f7b9664816130f243e448ff4bb1658a7d7d8372d7645e57c4116484"} Apr 24 21:49:49.619740 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:49:49.619681 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/stop-feature-test-kserve-bbc488f8c-mzq9z" podStartSLOduration=10.619661771 podStartE2EDuration="10.619661771s" podCreationTimestamp="2026-04-24 21:49:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:49:49.616628143 +0000 UTC m=+1977.675415815" watchObservedRunningTime="2026-04-24 21:49:49.619661771 +0000 UTC m=+1977.678449444" Apr 24 21:49:50.309521 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:49:50.309478 2560 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-bbc488f8c-mzq9z" Apr 24 21:49:50.309709 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:49:50.309627 2560 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/stop-feature-test-kserve-bbc488f8c-mzq9z" Apr 24 21:49:50.311168 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:49:50.311125 2560 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-bbc488f8c-mzq9z" podUID="7f6f52ab-95d0-4f28-bdaf-9f4ae9e0291c" containerName="main" probeResult="failure" output="Get \"https://10.134.0.48:8000/health\": dial tcp 10.134.0.48:8000: connect: connection refused" Apr 24 21:49:54.779641 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:49:54.779617 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-bbc488f8c-xwdgj_1a791291-5ef7-475a-b318-15889388f7b8/main/0.log" Apr 24 21:49:54.780030 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:49:54.780014 2560 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-bbc488f8c-xwdgj" Apr 24 21:49:54.873822 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:49:54.873784 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1a791291-5ef7-475a-b318-15889388f7b8-model-cache\") pod \"1a791291-5ef7-475a-b318-15889388f7b8\" (UID: \"1a791291-5ef7-475a-b318-15889388f7b8\") " Apr 24 21:49:54.874004 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:49:54.873856 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1a791291-5ef7-475a-b318-15889388f7b8-tmp-dir\") pod \"1a791291-5ef7-475a-b318-15889388f7b8\" (UID: \"1a791291-5ef7-475a-b318-15889388f7b8\") " Apr 24 21:49:54.874004 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:49:54.873908 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6x5wk\" (UniqueName: \"kubernetes.io/projected/1a791291-5ef7-475a-b318-15889388f7b8-kube-api-access-6x5wk\") pod \"1a791291-5ef7-475a-b318-15889388f7b8\" (UID: \"1a791291-5ef7-475a-b318-15889388f7b8\") " Apr 24 21:49:54.874004 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:49:54.873968 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1a791291-5ef7-475a-b318-15889388f7b8-tls-certs\") pod \"1a791291-5ef7-475a-b318-15889388f7b8\" (UID: \"1a791291-5ef7-475a-b318-15889388f7b8\") " Apr 24 21:49:54.874188 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:49:54.874017 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1a791291-5ef7-475a-b318-15889388f7b8-dshm\") pod \"1a791291-5ef7-475a-b318-15889388f7b8\" (UID: \"1a791291-5ef7-475a-b318-15889388f7b8\") " Apr 24 21:49:54.874188 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:49:54.874045 
2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1a791291-5ef7-475a-b318-15889388f7b8-home\") pod \"1a791291-5ef7-475a-b318-15889388f7b8\" (UID: \"1a791291-5ef7-475a-b318-15889388f7b8\") " Apr 24 21:49:54.874188 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:49:54.874059 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a791291-5ef7-475a-b318-15889388f7b8-model-cache" (OuterVolumeSpecName: "model-cache") pod "1a791291-5ef7-475a-b318-15889388f7b8" (UID: "1a791291-5ef7-475a-b318-15889388f7b8"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:49:54.874188 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:49:54.874079 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1a791291-5ef7-475a-b318-15889388f7b8-kserve-provision-location\") pod \"1a791291-5ef7-475a-b318-15889388f7b8\" (UID: \"1a791291-5ef7-475a-b318-15889388f7b8\") " Apr 24 21:49:54.874396 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:49:54.874378 2560 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1a791291-5ef7-475a-b318-15889388f7b8-model-cache\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\"" Apr 24 21:49:54.874916 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:49:54.874882 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a791291-5ef7-475a-b318-15889388f7b8-home" (OuterVolumeSpecName: "home") pod "1a791291-5ef7-475a-b318-15889388f7b8" (UID: "1a791291-5ef7-475a-b318-15889388f7b8"). InnerVolumeSpecName "home". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:49:54.876630 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:49:54.876586 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a791291-5ef7-475a-b318-15889388f7b8-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "1a791291-5ef7-475a-b318-15889388f7b8" (UID: "1a791291-5ef7-475a-b318-15889388f7b8"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:49:54.876756 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:49:54.876707 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a791291-5ef7-475a-b318-15889388f7b8-kube-api-access-6x5wk" (OuterVolumeSpecName: "kube-api-access-6x5wk") pod "1a791291-5ef7-475a-b318-15889388f7b8" (UID: "1a791291-5ef7-475a-b318-15889388f7b8"). InnerVolumeSpecName "kube-api-access-6x5wk". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:49:54.876821 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:49:54.876787 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a791291-5ef7-475a-b318-15889388f7b8-dshm" (OuterVolumeSpecName: "dshm") pod "1a791291-5ef7-475a-b318-15889388f7b8" (UID: "1a791291-5ef7-475a-b318-15889388f7b8"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:49:54.885442 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:49:54.885415 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a791291-5ef7-475a-b318-15889388f7b8-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "1a791291-5ef7-475a-b318-15889388f7b8" (UID: "1a791291-5ef7-475a-b318-15889388f7b8"). InnerVolumeSpecName "tmp-dir". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:49:54.924298 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:49:54.924258 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a791291-5ef7-475a-b318-15889388f7b8-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "1a791291-5ef7-475a-b318-15889388f7b8" (UID: "1a791291-5ef7-475a-b318-15889388f7b8"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:49:54.975687 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:49:54.975644 2560 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1a791291-5ef7-475a-b318-15889388f7b8-dshm\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\"" Apr 24 21:49:54.975687 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:49:54.975679 2560 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1a791291-5ef7-475a-b318-15889388f7b8-home\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\"" Apr 24 21:49:54.975956 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:49:54.975717 2560 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1a791291-5ef7-475a-b318-15889388f7b8-kserve-provision-location\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\"" Apr 24 21:49:54.975956 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:49:54.975732 2560 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1a791291-5ef7-475a-b318-15889388f7b8-tmp-dir\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\"" Apr 24 21:49:54.975956 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:49:54.975751 2560 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6x5wk\" (UniqueName: 
\"kubernetes.io/projected/1a791291-5ef7-475a-b318-15889388f7b8-kube-api-access-6x5wk\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\"" Apr 24 21:49:54.975956 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:49:54.975769 2560 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1a791291-5ef7-475a-b318-15889388f7b8-tls-certs\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\"" Apr 24 21:49:55.623089 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:49:55.623063 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-bbc488f8c-xwdgj_1a791291-5ef7-475a-b318-15889388f7b8/main/0.log" Apr 24 21:49:55.623424 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:49:55.623396 2560 generic.go:358] "Generic (PLEG): container finished" podID="1a791291-5ef7-475a-b318-15889388f7b8" containerID="e4cfd9918afc3e04f35e04203c8fc7223df8b5a95494bed66f0eb0fd05c5c837" exitCode=137 Apr 24 21:49:55.623529 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:49:55.623478 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-bbc488f8c-xwdgj" event={"ID":"1a791291-5ef7-475a-b318-15889388f7b8","Type":"ContainerDied","Data":"e4cfd9918afc3e04f35e04203c8fc7223df8b5a95494bed66f0eb0fd05c5c837"} Apr 24 21:49:55.623529 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:49:55.623517 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-bbc488f8c-xwdgj" event={"ID":"1a791291-5ef7-475a-b318-15889388f7b8","Type":"ContainerDied","Data":"ceb8a8b688f59a411cedb77eafbea10178775684f2ba9eaddf12afaa7dc0b44a"} Apr 24 21:49:55.623716 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:49:55.623527 2560 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-bbc488f8c-xwdgj" Apr 24 21:49:55.623716 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:49:55.623531 2560 scope.go:117] "RemoveContainer" containerID="e4cfd9918afc3e04f35e04203c8fc7223df8b5a95494bed66f0eb0fd05c5c837" Apr 24 21:49:55.632176 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:49:55.632157 2560 scope.go:117] "RemoveContainer" containerID="7df52fb13562d5f540f27215b1bf5ee50915b3bf92b59c1167208568ab4f9bd9" Apr 24 21:49:55.649053 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:49:55.649028 2560 scope.go:117] "RemoveContainer" containerID="e4cfd9918afc3e04f35e04203c8fc7223df8b5a95494bed66f0eb0fd05c5c837" Apr 24 21:49:55.649358 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:49:55.649335 2560 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4cfd9918afc3e04f35e04203c8fc7223df8b5a95494bed66f0eb0fd05c5c837\": container with ID starting with e4cfd9918afc3e04f35e04203c8fc7223df8b5a95494bed66f0eb0fd05c5c837 not found: ID does not exist" containerID="e4cfd9918afc3e04f35e04203c8fc7223df8b5a95494bed66f0eb0fd05c5c837" Apr 24 21:49:55.649514 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:49:55.649488 2560 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4cfd9918afc3e04f35e04203c8fc7223df8b5a95494bed66f0eb0fd05c5c837"} err="failed to get container status \"e4cfd9918afc3e04f35e04203c8fc7223df8b5a95494bed66f0eb0fd05c5c837\": rpc error: code = NotFound desc = could not find container \"e4cfd9918afc3e04f35e04203c8fc7223df8b5a95494bed66f0eb0fd05c5c837\": container with ID starting with e4cfd9918afc3e04f35e04203c8fc7223df8b5a95494bed66f0eb0fd05c5c837 not found: ID does not exist" Apr 24 21:49:55.649628 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:49:55.649607 2560 scope.go:117] "RemoveContainer" containerID="7df52fb13562d5f540f27215b1bf5ee50915b3bf92b59c1167208568ab4f9bd9" Apr 24 
21:49:55.650105 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:49:55.650062 2560 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7df52fb13562d5f540f27215b1bf5ee50915b3bf92b59c1167208568ab4f9bd9\": container with ID starting with 7df52fb13562d5f540f27215b1bf5ee50915b3bf92b59c1167208568ab4f9bd9 not found: ID does not exist" containerID="7df52fb13562d5f540f27215b1bf5ee50915b3bf92b59c1167208568ab4f9bd9" Apr 24 21:49:55.650206 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:49:55.650110 2560 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7df52fb13562d5f540f27215b1bf5ee50915b3bf92b59c1167208568ab4f9bd9"} err="failed to get container status \"7df52fb13562d5f540f27215b1bf5ee50915b3bf92b59c1167208568ab4f9bd9\": rpc error: code = NotFound desc = could not find container \"7df52fb13562d5f540f27215b1bf5ee50915b3bf92b59c1167208568ab4f9bd9\": container with ID starting with 7df52fb13562d5f540f27215b1bf5ee50915b3bf92b59c1167208568ab4f9bd9 not found: ID does not exist" Apr 24 21:49:55.650807 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:49:55.650786 2560 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-bbc488f8c-xwdgj"] Apr 24 21:49:55.653682 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:49:55.653662 2560 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-bbc488f8c-xwdgj"] Apr 24 21:49:56.445578 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:49:56.445543 2560 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a791291-5ef7-475a-b318-15889388f7b8" path="/var/lib/kubelet/pods/1a791291-5ef7-475a-b318-15889388f7b8/volumes" Apr 24 21:50:00.309708 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:50:00.309665 2560 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-bbc488f8c-mzq9z" 
podUID="7f6f52ab-95d0-4f28-bdaf-9f4ae9e0291c" containerName="main" probeResult="failure" output="Get \"https://10.134.0.48:8000/health\": dial tcp 10.134.0.48:8000: connect: connection refused" Apr 24 21:50:10.309133 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:50:10.309087 2560 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-bbc488f8c-mzq9z" podUID="7f6f52ab-95d0-4f28-bdaf-9f4ae9e0291c" containerName="main" probeResult="failure" output="Get \"https://10.134.0.48:8000/health\": dial tcp 10.134.0.48:8000: connect: connection refused" Apr 24 21:50:20.309741 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:50:20.309683 2560 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-bbc488f8c-mzq9z" podUID="7f6f52ab-95d0-4f28-bdaf-9f4ae9e0291c" containerName="main" probeResult="failure" output="Get \"https://10.134.0.48:8000/health\": dial tcp 10.134.0.48:8000: connect: connection refused" Apr 24 21:50:30.309967 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:50:30.309881 2560 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-bbc488f8c-mzq9z" podUID="7f6f52ab-95d0-4f28-bdaf-9f4ae9e0291c" containerName="main" probeResult="failure" output="Get \"https://10.134.0.48:8000/health\": dial tcp 10.134.0.48:8000: connect: connection refused" Apr 24 21:50:40.309718 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:50:40.309673 2560 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-bbc488f8c-mzq9z" podUID="7f6f52ab-95d0-4f28-bdaf-9f4ae9e0291c" containerName="main" probeResult="failure" output="Get \"https://10.134.0.48:8000/health\": dial tcp 10.134.0.48:8000: connect: connection refused" Apr 24 21:50:50.309219 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:50:50.309177 2560 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-bbc488f8c-mzq9z" 
podUID="7f6f52ab-95d0-4f28-bdaf-9f4ae9e0291c" containerName="main" probeResult="failure" output="Get \"https://10.134.0.48:8000/health\": dial tcp 10.134.0.48:8000: connect: connection refused" Apr 24 21:51:00.309653 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:51:00.309604 2560 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-bbc488f8c-mzq9z" podUID="7f6f52ab-95d0-4f28-bdaf-9f4ae9e0291c" containerName="main" probeResult="failure" output="Get \"https://10.134.0.48:8000/health\": dial tcp 10.134.0.48:8000: connect: connection refused" Apr 24 21:51:10.309337 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:51:10.309290 2560 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-bbc488f8c-mzq9z" podUID="7f6f52ab-95d0-4f28-bdaf-9f4ae9e0291c" containerName="main" probeResult="failure" output="Get \"https://10.134.0.48:8000/health\": dial tcp 10.134.0.48:8000: connect: connection refused" Apr 24 21:51:20.309611 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:51:20.309502 2560 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-bbc488f8c-mzq9z" podUID="7f6f52ab-95d0-4f28-bdaf-9f4ae9e0291c" containerName="main" probeResult="failure" output="Get \"https://10.134.0.48:8000/health\": dial tcp 10.134.0.48:8000: connect: connection refused" Apr 24 21:51:30.318622 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:51:30.318588 2560 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/stop-feature-test-kserve-bbc488f8c-mzq9z" Apr 24 21:51:30.326365 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:51:30.326337 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-bbc488f8c-mzq9z" Apr 24 21:51:31.501746 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:51:31.501711 2560 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/stop-feature-test-kserve-bbc488f8c-mzq9z"] Apr 24 21:51:31.988507 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:51:31.988468 2560 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-bbc488f8c-mzq9z" podUID="7f6f52ab-95d0-4f28-bdaf-9f4ae9e0291c" containerName="main" containerID="cri-o://4151fefe8f7b9664816130f243e448ff4bb1658a7d7d8372d7645e57c4116484" gracePeriod=30 Apr 24 21:51:52.511031 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:51:52.511000 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6rbsp_260be1ca-939b-4b9e-9f93-078d2506aef0/ovn-acl-logging/0.log" Apr 24 21:51:52.511997 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:51:52.511973 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-128-142.ec2.internal_784ad9f5e87f1e75291c25fc06106e5e/kube-rbac-proxy-crio/2.log" Apr 24 21:51:52.514996 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:51:52.514976 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6rbsp_260be1ca-939b-4b9e-9f93-078d2506aef0/ovn-acl-logging/0.log" Apr 24 21:51:52.515905 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:51:52.515889 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-128-142.ec2.internal_784ad9f5e87f1e75291c25fc06106e5e/kube-rbac-proxy-crio/2.log" Apr 24 21:52:02.265226 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:52:02.265200 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-bbc488f8c-mzq9z_7f6f52ab-95d0-4f28-bdaf-9f4ae9e0291c/main/0.log" Apr 24 21:52:02.265601 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:52:02.265580 2560 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-bbc488f8c-mzq9z" Apr 24 21:52:02.366038 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:52:02.366002 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7f6f52ab-95d0-4f28-bdaf-9f4ae9e0291c-tmp-dir\") pod \"7f6f52ab-95d0-4f28-bdaf-9f4ae9e0291c\" (UID: \"7f6f52ab-95d0-4f28-bdaf-9f4ae9e0291c\") " Apr 24 21:52:02.366237 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:52:02.366078 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/7f6f52ab-95d0-4f28-bdaf-9f4ae9e0291c-dshm\") pod \"7f6f52ab-95d0-4f28-bdaf-9f4ae9e0291c\" (UID: \"7f6f52ab-95d0-4f28-bdaf-9f4ae9e0291c\") " Apr 24 21:52:02.366237 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:52:02.366116 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7f6f52ab-95d0-4f28-bdaf-9f4ae9e0291c-tls-certs\") pod \"7f6f52ab-95d0-4f28-bdaf-9f4ae9e0291c\" (UID: \"7f6f52ab-95d0-4f28-bdaf-9f4ae9e0291c\") " Apr 24 21:52:02.366237 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:52:02.366143 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/7f6f52ab-95d0-4f28-bdaf-9f4ae9e0291c-home\") pod \"7f6f52ab-95d0-4f28-bdaf-9f4ae9e0291c\" (UID: \"7f6f52ab-95d0-4f28-bdaf-9f4ae9e0291c\") " Apr 24 21:52:02.366237 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:52:02.366185 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7f6f52ab-95d0-4f28-bdaf-9f4ae9e0291c-kserve-provision-location\") pod \"7f6f52ab-95d0-4f28-bdaf-9f4ae9e0291c\" (UID: \"7f6f52ab-95d0-4f28-bdaf-9f4ae9e0291c\") " Apr 24 21:52:02.366237 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:52:02.366226 2560 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6zwnt\" (UniqueName: \"kubernetes.io/projected/7f6f52ab-95d0-4f28-bdaf-9f4ae9e0291c-kube-api-access-6zwnt\") pod \"7f6f52ab-95d0-4f28-bdaf-9f4ae9e0291c\" (UID: \"7f6f52ab-95d0-4f28-bdaf-9f4ae9e0291c\") " Apr 24 21:52:02.366559 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:52:02.366260 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/7f6f52ab-95d0-4f28-bdaf-9f4ae9e0291c-model-cache\") pod \"7f6f52ab-95d0-4f28-bdaf-9f4ae9e0291c\" (UID: \"7f6f52ab-95d0-4f28-bdaf-9f4ae9e0291c\") " Apr 24 21:52:02.366678 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:52:02.366639 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f6f52ab-95d0-4f28-bdaf-9f4ae9e0291c-model-cache" (OuterVolumeSpecName: "model-cache") pod "7f6f52ab-95d0-4f28-bdaf-9f4ae9e0291c" (UID: "7f6f52ab-95d0-4f28-bdaf-9f4ae9e0291c"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:52:02.366994 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:52:02.366971 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f6f52ab-95d0-4f28-bdaf-9f4ae9e0291c-home" (OuterVolumeSpecName: "home") pod "7f6f52ab-95d0-4f28-bdaf-9f4ae9e0291c" (UID: "7f6f52ab-95d0-4f28-bdaf-9f4ae9e0291c"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:52:02.368281 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:52:02.368248 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f6f52ab-95d0-4f28-bdaf-9f4ae9e0291c-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "7f6f52ab-95d0-4f28-bdaf-9f4ae9e0291c" (UID: "7f6f52ab-95d0-4f28-bdaf-9f4ae9e0291c"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:52:02.368587 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:52:02.368566 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f6f52ab-95d0-4f28-bdaf-9f4ae9e0291c-dshm" (OuterVolumeSpecName: "dshm") pod "7f6f52ab-95d0-4f28-bdaf-9f4ae9e0291c" (UID: "7f6f52ab-95d0-4f28-bdaf-9f4ae9e0291c"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:52:02.368672 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:52:02.368646 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f6f52ab-95d0-4f28-bdaf-9f4ae9e0291c-kube-api-access-6zwnt" (OuterVolumeSpecName: "kube-api-access-6zwnt") pod "7f6f52ab-95d0-4f28-bdaf-9f4ae9e0291c" (UID: "7f6f52ab-95d0-4f28-bdaf-9f4ae9e0291c"). InnerVolumeSpecName "kube-api-access-6zwnt". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:52:02.379465 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:52:02.379439 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f6f52ab-95d0-4f28-bdaf-9f4ae9e0291c-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "7f6f52ab-95d0-4f28-bdaf-9f4ae9e0291c" (UID: "7f6f52ab-95d0-4f28-bdaf-9f4ae9e0291c"). InnerVolumeSpecName "tmp-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:52:02.425126 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:52:02.425068 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f6f52ab-95d0-4f28-bdaf-9f4ae9e0291c-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "7f6f52ab-95d0-4f28-bdaf-9f4ae9e0291c" (UID: "7f6f52ab-95d0-4f28-bdaf-9f4ae9e0291c"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:52:02.467521 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:52:02.467489 2560 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6zwnt\" (UniqueName: \"kubernetes.io/projected/7f6f52ab-95d0-4f28-bdaf-9f4ae9e0291c-kube-api-access-6zwnt\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\"" Apr 24 21:52:02.467521 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:52:02.467516 2560 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/7f6f52ab-95d0-4f28-bdaf-9f4ae9e0291c-model-cache\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\"" Apr 24 21:52:02.467692 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:52:02.467531 2560 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7f6f52ab-95d0-4f28-bdaf-9f4ae9e0291c-tmp-dir\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\"" Apr 24 21:52:02.467692 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:52:02.467544 2560 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/7f6f52ab-95d0-4f28-bdaf-9f4ae9e0291c-dshm\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\"" Apr 24 21:52:02.467692 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:52:02.467555 2560 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7f6f52ab-95d0-4f28-bdaf-9f4ae9e0291c-tls-certs\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\"" Apr 24 21:52:02.467692 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:52:02.467566 2560 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/7f6f52ab-95d0-4f28-bdaf-9f4ae9e0291c-home\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\"" Apr 24 21:52:02.467692 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:52:02.467578 2560 reconciler_common.go:299] "Volume detached 
for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7f6f52ab-95d0-4f28-bdaf-9f4ae9e0291c-kserve-provision-location\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\"" Apr 24 21:52:03.108081 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:52:03.108051 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-bbc488f8c-mzq9z_7f6f52ab-95d0-4f28-bdaf-9f4ae9e0291c/main/0.log" Apr 24 21:52:03.108407 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:52:03.108385 2560 generic.go:358] "Generic (PLEG): container finished" podID="7f6f52ab-95d0-4f28-bdaf-9f4ae9e0291c" containerID="4151fefe8f7b9664816130f243e448ff4bb1658a7d7d8372d7645e57c4116484" exitCode=137 Apr 24 21:52:03.108502 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:52:03.108468 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-bbc488f8c-mzq9z" event={"ID":"7f6f52ab-95d0-4f28-bdaf-9f4ae9e0291c","Type":"ContainerDied","Data":"4151fefe8f7b9664816130f243e448ff4bb1658a7d7d8372d7645e57c4116484"} Apr 24 21:52:03.108502 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:52:03.108482 2560 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-bbc488f8c-mzq9z" Apr 24 21:52:03.108502 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:52:03.108498 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-bbc488f8c-mzq9z" event={"ID":"7f6f52ab-95d0-4f28-bdaf-9f4ae9e0291c","Type":"ContainerDied","Data":"32d8f6dde14317d38ef5b73797f50a0504c028c837e1b2fb7cce089c83c10765"} Apr 24 21:52:03.108665 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:52:03.108514 2560 scope.go:117] "RemoveContainer" containerID="4151fefe8f7b9664816130f243e448ff4bb1658a7d7d8372d7645e57c4116484" Apr 24 21:52:03.116913 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:52:03.116889 2560 scope.go:117] "RemoveContainer" containerID="23160c895b0788bea9a3ff9a30749e5a5ec88d55feb588965da27f7901a98fa7" Apr 24 21:52:03.129240 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:52:03.129205 2560 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-bbc488f8c-mzq9z"] Apr 24 21:52:03.132492 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:52:03.132463 2560 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-bbc488f8c-mzq9z"] Apr 24 21:52:03.184874 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:52:03.184840 2560 scope.go:117] "RemoveContainer" containerID="4151fefe8f7b9664816130f243e448ff4bb1658a7d7d8372d7645e57c4116484" Apr 24 21:52:03.185247 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:52:03.185220 2560 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4151fefe8f7b9664816130f243e448ff4bb1658a7d7d8372d7645e57c4116484\": container with ID starting with 4151fefe8f7b9664816130f243e448ff4bb1658a7d7d8372d7645e57c4116484 not found: ID does not exist" containerID="4151fefe8f7b9664816130f243e448ff4bb1658a7d7d8372d7645e57c4116484" Apr 24 21:52:03.185333 ip-10-0-128-142 kubenswrapper[2560]: I0424 
21:52:03.185256 2560 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4151fefe8f7b9664816130f243e448ff4bb1658a7d7d8372d7645e57c4116484"} err="failed to get container status \"4151fefe8f7b9664816130f243e448ff4bb1658a7d7d8372d7645e57c4116484\": rpc error: code = NotFound desc = could not find container \"4151fefe8f7b9664816130f243e448ff4bb1658a7d7d8372d7645e57c4116484\": container with ID starting with 4151fefe8f7b9664816130f243e448ff4bb1658a7d7d8372d7645e57c4116484 not found: ID does not exist" Apr 24 21:52:03.185333 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:52:03.185276 2560 scope.go:117] "RemoveContainer" containerID="23160c895b0788bea9a3ff9a30749e5a5ec88d55feb588965da27f7901a98fa7" Apr 24 21:52:03.185539 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:52:03.185521 2560 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23160c895b0788bea9a3ff9a30749e5a5ec88d55feb588965da27f7901a98fa7\": container with ID starting with 23160c895b0788bea9a3ff9a30749e5a5ec88d55feb588965da27f7901a98fa7 not found: ID does not exist" containerID="23160c895b0788bea9a3ff9a30749e5a5ec88d55feb588965da27f7901a98fa7" Apr 24 21:52:03.185589 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:52:03.185545 2560 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23160c895b0788bea9a3ff9a30749e5a5ec88d55feb588965da27f7901a98fa7"} err="failed to get container status \"23160c895b0788bea9a3ff9a30749e5a5ec88d55feb588965da27f7901a98fa7\": rpc error: code = NotFound desc = could not find container \"23160c895b0788bea9a3ff9a30749e5a5ec88d55feb588965da27f7901a98fa7\": container with ID starting with 23160c895b0788bea9a3ff9a30749e5a5ec88d55feb588965da27f7901a98fa7 not found: ID does not exist" Apr 24 21:52:04.446671 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:52:04.446632 2560 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="7f6f52ab-95d0-4f28-bdaf-9f4ae9e0291c" path="/var/lib/kubelet/pods/7f6f52ab-95d0-4f28-bdaf-9f4ae9e0291c/volumes" Apr 24 21:52:55.577605 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:52:55.577538 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d5fc8899959zdp"] Apr 24 21:52:55.578081 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:52:55.577918 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7f6f52ab-95d0-4f28-bdaf-9f4ae9e0291c" containerName="main" Apr 24 21:52:55.578081 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:52:55.577947 2560 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f6f52ab-95d0-4f28-bdaf-9f4ae9e0291c" containerName="main" Apr 24 21:52:55.578081 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:52:55.577961 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1a791291-5ef7-475a-b318-15889388f7b8" containerName="storage-initializer" Apr 24 21:52:55.578081 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:52:55.577970 2560 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a791291-5ef7-475a-b318-15889388f7b8" containerName="storage-initializer" Apr 24 21:52:55.578081 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:52:55.577982 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7f6f52ab-95d0-4f28-bdaf-9f4ae9e0291c" containerName="storage-initializer" Apr 24 21:52:55.578081 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:52:55.577987 2560 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f6f52ab-95d0-4f28-bdaf-9f4ae9e0291c" containerName="storage-initializer" Apr 24 21:52:55.578081 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:52:55.578005 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1a791291-5ef7-475a-b318-15889388f7b8" containerName="main" Apr 24 21:52:55.578081 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:52:55.578011 2560 
state_mem.go:107] "Deleted CPUSet assignment" podUID="1a791291-5ef7-475a-b318-15889388f7b8" containerName="main"
Apr 24 21:52:55.578081 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:52:55.578061 2560 memory_manager.go:356] "RemoveStaleState removing state" podUID="1a791291-5ef7-475a-b318-15889388f7b8" containerName="main"
Apr 24 21:52:55.578081 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:52:55.578069 2560 memory_manager.go:356] "RemoveStaleState removing state" podUID="7f6f52ab-95d0-4f28-bdaf-9f4ae9e0291c" containerName="main"
Apr 24 21:52:55.581315 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:52:55.581293 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d5fc8899959zdp"
Apr 24 21:52:55.583553 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:52:55.583531 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv8de1d74aab16d9cabd8b5aafeb5248e8-kserve-self-signed-certs\""
Apr 24 21:52:55.594422 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:52:55.594399 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d5fc8899959zdp"]
Apr 24 21:52:55.609273 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:52:55.609250 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a027bf3c-14c7-4726-adc4-7588b3b3147c-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d5fc8899959zdp\" (UID: \"a027bf3c-14c7-4726-adc4-7588b3b3147c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d5fc8899959zdp"
Apr 24 21:52:55.609391 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:52:55.609285 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2h24t\" (UniqueName: \"kubernetes.io/projected/a027bf3c-14c7-4726-adc4-7588b3b3147c-kube-api-access-2h24t\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d5fc8899959zdp\" (UID: \"a027bf3c-14c7-4726-adc4-7588b3b3147c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d5fc8899959zdp"
Apr 24 21:52:55.609391 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:52:55.609316 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a027bf3c-14c7-4726-adc4-7588b3b3147c-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d5fc8899959zdp\" (UID: \"a027bf3c-14c7-4726-adc4-7588b3b3147c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d5fc8899959zdp"
Apr 24 21:52:55.609391 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:52:55.609342 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a027bf3c-14c7-4726-adc4-7588b3b3147c-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d5fc8899959zdp\" (UID: \"a027bf3c-14c7-4726-adc4-7588b3b3147c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d5fc8899959zdp"
Apr 24 21:52:55.609391 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:52:55.609386 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a027bf3c-14c7-4726-adc4-7588b3b3147c-tmp-dir\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d5fc8899959zdp\" (UID: \"a027bf3c-14c7-4726-adc4-7588b3b3147c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d5fc8899959zdp"
Apr 24 21:52:55.609532 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:52:55.609411 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a027bf3c-14c7-4726-adc4-7588b3b3147c-dshm\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d5fc8899959zdp\" (UID: \"a027bf3c-14c7-4726-adc4-7588b3b3147c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d5fc8899959zdp"
Apr 24 21:52:55.609532 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:52:55.609438 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a027bf3c-14c7-4726-adc4-7588b3b3147c-home\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d5fc8899959zdp\" (UID: \"a027bf3c-14c7-4726-adc4-7588b3b3147c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d5fc8899959zdp"
Apr 24 21:52:55.709822 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:52:55.709792 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a027bf3c-14c7-4726-adc4-7588b3b3147c-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d5fc8899959zdp\" (UID: \"a027bf3c-14c7-4726-adc4-7588b3b3147c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d5fc8899959zdp"
Apr 24 21:52:55.709822 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:52:55.709827 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a027bf3c-14c7-4726-adc4-7588b3b3147c-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d5fc8899959zdp\" (UID: \"a027bf3c-14c7-4726-adc4-7588b3b3147c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d5fc8899959zdp"
Apr 24 21:52:55.710090 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:52:55.709850 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a027bf3c-14c7-4726-adc4-7588b3b3147c-tmp-dir\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d5fc8899959zdp\" (UID: \"a027bf3c-14c7-4726-adc4-7588b3b3147c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d5fc8899959zdp"
Apr 24 21:52:55.710090 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:52:55.709882 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a027bf3c-14c7-4726-adc4-7588b3b3147c-dshm\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d5fc8899959zdp\" (UID: \"a027bf3c-14c7-4726-adc4-7588b3b3147c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d5fc8899959zdp"
Apr 24 21:52:55.710090 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:52:55.709961 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a027bf3c-14c7-4726-adc4-7588b3b3147c-home\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d5fc8899959zdp\" (UID: \"a027bf3c-14c7-4726-adc4-7588b3b3147c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d5fc8899959zdp"
Apr 24 21:52:55.710090 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:52:55.710009 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a027bf3c-14c7-4726-adc4-7588b3b3147c-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d5fc8899959zdp\" (UID: \"a027bf3c-14c7-4726-adc4-7588b3b3147c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d5fc8899959zdp"
Apr 24 21:52:55.710090 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:52:55.710047 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2h24t\" (UniqueName: \"kubernetes.io/projected/a027bf3c-14c7-4726-adc4-7588b3b3147c-kube-api-access-2h24t\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d5fc8899959zdp\" (UID: \"a027bf3c-14c7-4726-adc4-7588b3b3147c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d5fc8899959zdp"
Apr 24 21:52:55.710381 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:52:55.710288 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a027bf3c-14c7-4726-adc4-7588b3b3147c-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d5fc8899959zdp\" (UID: \"a027bf3c-14c7-4726-adc4-7588b3b3147c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d5fc8899959zdp"
Apr 24 21:52:55.710381 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:52:55.710336 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a027bf3c-14c7-4726-adc4-7588b3b3147c-tmp-dir\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d5fc8899959zdp\" (UID: \"a027bf3c-14c7-4726-adc4-7588b3b3147c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d5fc8899959zdp"
Apr 24 21:52:55.710505 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:52:55.710415 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a027bf3c-14c7-4726-adc4-7588b3b3147c-home\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d5fc8899959zdp\" (UID: \"a027bf3c-14c7-4726-adc4-7588b3b3147c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d5fc8899959zdp"
Apr 24 21:52:55.710567 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:52:55.710497 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a027bf3c-14c7-4726-adc4-7588b3b3147c-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d5fc8899959zdp\" (UID: \"a027bf3c-14c7-4726-adc4-7588b3b3147c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d5fc8899959zdp"
Apr 24 21:52:55.712371 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:52:55.712351 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a027bf3c-14c7-4726-adc4-7588b3b3147c-dshm\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d5fc8899959zdp\" (UID: \"a027bf3c-14c7-4726-adc4-7588b3b3147c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d5fc8899959zdp"
Apr 24 21:52:55.712484 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:52:55.712468 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a027bf3c-14c7-4726-adc4-7588b3b3147c-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d5fc8899959zdp\" (UID: \"a027bf3c-14c7-4726-adc4-7588b3b3147c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d5fc8899959zdp"
Apr 24 21:52:55.717609 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:52:55.717587 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2h24t\" (UniqueName: \"kubernetes.io/projected/a027bf3c-14c7-4726-adc4-7588b3b3147c-kube-api-access-2h24t\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d5fc8899959zdp\" (UID: \"a027bf3c-14c7-4726-adc4-7588b3b3147c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d5fc8899959zdp"
Apr 24 21:52:55.894892 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:52:55.894800 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d5fc8899959zdp"
Apr 24 21:52:56.028140 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:52:56.028104 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d5fc8899959zdp"]
Apr 24 21:52:56.030989 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:52:56.030954 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda027bf3c_14c7_4726_adc4_7588b3b3147c.slice/crio-1575c7a57bfbda2f7b0b08feabb520588dbb79759ec3b215aa937bc49b7848f9 WatchSource:0}: Error finding container 1575c7a57bfbda2f7b0b08feabb520588dbb79759ec3b215aa937bc49b7848f9: Status 404 returned error can't find the container with id 1575c7a57bfbda2f7b0b08feabb520588dbb79759ec3b215aa937bc49b7848f9
Apr 24 21:52:56.032848 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:52:56.032828 2560 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 21:52:56.302935 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:52:56.302894 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d5fc8899959zdp" event={"ID":"a027bf3c-14c7-4726-adc4-7588b3b3147c","Type":"ContainerStarted","Data":"c63b0b1568a921ca6b64400100ed904a3ae3a02dd738dcfcf3aa6e58f5e9f380"}
Apr 24 21:52:56.303133 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:52:56.302952 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d5fc8899959zdp" event={"ID":"a027bf3c-14c7-4726-adc4-7588b3b3147c","Type":"ContainerStarted","Data":"1575c7a57bfbda2f7b0b08feabb520588dbb79759ec3b215aa937bc49b7848f9"}
Apr 24 21:54:50.711001 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:54:50.710969 2560 generic.go:358] "Generic (PLEG): container finished" podID="a027bf3c-14c7-4726-adc4-7588b3b3147c" containerID="c63b0b1568a921ca6b64400100ed904a3ae3a02dd738dcfcf3aa6e58f5e9f380" exitCode=0
Apr 24 21:54:50.711361 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:54:50.711042 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d5fc8899959zdp" event={"ID":"a027bf3c-14c7-4726-adc4-7588b3b3147c","Type":"ContainerDied","Data":"c63b0b1568a921ca6b64400100ed904a3ae3a02dd738dcfcf3aa6e58f5e9f380"}
Apr 24 21:54:51.722360 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:54:51.722324 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d5fc8899959zdp" event={"ID":"a027bf3c-14c7-4726-adc4-7588b3b3147c","Type":"ContainerStarted","Data":"5fe7b983147b8884820076e5b210336df44b9ab965ec66a42e722ec92f22d71c"}
Apr 24 21:54:51.744447 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:54:51.744399 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d5fc8899959zdp" podStartSLOduration=116.744385271 podStartE2EDuration="1m56.744385271s" podCreationTimestamp="2026-04-24 21:52:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:54:51.742553373 +0000 UTC m=+2279.801341046" watchObservedRunningTime="2026-04-24 21:54:51.744385271 +0000 UTC m=+2279.803172943"
Apr 24 21:54:55.895002 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:54:55.894911 2560 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d5fc8899959zdp"
Apr 24 21:54:55.895386 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:54:55.895018 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d5fc8899959zdp"
Apr 24 21:54:55.896719 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:54:55.896689 2560 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d5fc8899959zdp" podUID="a027bf3c-14c7-4726-adc4-7588b3b3147c" containerName="main" probeResult="failure" output="Get \"https://10.134.0.49:8000/health\": dial tcp 10.134.0.49:8000: connect: connection refused"
Apr 24 21:55:05.895915 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:05.895868 2560 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d5fc8899959zdp" podUID="a027bf3c-14c7-4726-adc4-7588b3b3147c" containerName="main" probeResult="failure" output="Get \"https://10.134.0.49:8000/health\": dial tcp 10.134.0.49:8000: connect: connection refused"
Apr 24 21:55:11.418894 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:11.418848 2560 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-8f8b95844-msd7t"]
Apr 24 21:55:11.419397 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:11.419153 2560 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-8f8b95844-msd7t" podUID="0b81b123-aae5-4625-97b5-b21d96f905a4" containerName="storage-initializer" containerID="cri-o://88644662690b2d67ca289727b90a4e56ebd285a52f94abfb4dd7fc2403a242c1" gracePeriod=30
Apr 24 21:55:11.426417 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:11.426386 2560 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-585b44dc45-r4v84"]
Apr 24 21:55:11.426870 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:11.426839 2560 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-585b44dc45-r4v84" podUID="549eef45-c8c3-4375-b49a-9383c8d8525d" containerName="main" containerID="cri-o://9784719d02259ffae5383279fd8321432e72502e59edfb48b608d3fa809deebc" gracePeriod=30
Apr 24 21:55:15.895877 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:15.895836 2560 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d5fc8899959zdp" podUID="a027bf3c-14c7-4726-adc4-7588b3b3147c" containerName="main" probeResult="failure" output="Get \"https://10.134.0.49:8000/health\": dial tcp 10.134.0.49:8000: connect: connection refused"
Apr 24 21:55:25.896063 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:25.896022 2560 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d5fc8899959zdp" podUID="a027bf3c-14c7-4726-adc4-7588b3b3147c" containerName="main" probeResult="failure" output="Get \"https://10.134.0.49:8000/health\": dial tcp 10.134.0.49:8000: connect: connection refused"
Apr 24 21:55:35.624854 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:35.624815 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5bfbbd4669-ff7n8"]
Apr 24 21:55:35.628308 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:35.628292 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5bfbbd4669-ff7n8"
Apr 24 21:55:35.631019 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:35.630996 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-kserve-dockercfg-7c2d6\""
Apr 24 21:55:35.631145 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:35.631076 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-kserve-self-signed-certs\""
Apr 24 21:55:35.641223 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:35.641202 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5bfbbd4669-ff7n8"]
Apr 24 21:55:35.653570 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:35.653547 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-b694fbd9b-m86qq"]
Apr 24 21:55:35.656882 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:35.656864 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-b694fbd9b-m86qq"
Apr 24 21:55:35.675366 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:35.675342 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-b694fbd9b-m86qq"]
Apr 24 21:55:35.707900 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:35.707866 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/942f3445-f5e3-48c9-a901-64599b5da99a-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-5bfbbd4669-ff7n8\" (UID: \"942f3445-f5e3-48c9-a901-64599b5da99a\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5bfbbd4669-ff7n8"
Apr 24 21:55:35.708061 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:35.707917 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/942f3445-f5e3-48c9-a901-64599b5da99a-home\") pod \"router-with-refs-pd-test-kserve-5bfbbd4669-ff7n8\" (UID: \"942f3445-f5e3-48c9-a901-64599b5da99a\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5bfbbd4669-ff7n8"
Apr 24 21:55:35.708061 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:35.707964 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/942f3445-f5e3-48c9-a901-64599b5da99a-tls-certs\") pod \"router-with-refs-pd-test-kserve-5bfbbd4669-ff7n8\" (UID: \"942f3445-f5e3-48c9-a901-64599b5da99a\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5bfbbd4669-ff7n8"
Apr 24 21:55:35.708061 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:35.708017 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/942f3445-f5e3-48c9-a901-64599b5da99a-tmp-dir\") pod \"router-with-refs-pd-test-kserve-5bfbbd4669-ff7n8\" (UID: \"942f3445-f5e3-48c9-a901-64599b5da99a\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5bfbbd4669-ff7n8"
Apr 24 21:55:35.708218 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:35.708060 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/942f3445-f5e3-48c9-a901-64599b5da99a-dshm\") pod \"router-with-refs-pd-test-kserve-5bfbbd4669-ff7n8\" (UID: \"942f3445-f5e3-48c9-a901-64599b5da99a\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5bfbbd4669-ff7n8"
Apr 24 21:55:35.708218 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:35.708105 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/942f3445-f5e3-48c9-a901-64599b5da99a-model-cache\") pod \"router-with-refs-pd-test-kserve-5bfbbd4669-ff7n8\" (UID: \"942f3445-f5e3-48c9-a901-64599b5da99a\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5bfbbd4669-ff7n8"
Apr 24 21:55:35.708218 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:35.708130 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9czfl\" (UniqueName: \"kubernetes.io/projected/942f3445-f5e3-48c9-a901-64599b5da99a-kube-api-access-9czfl\") pod \"router-with-refs-pd-test-kserve-5bfbbd4669-ff7n8\" (UID: \"942f3445-f5e3-48c9-a901-64599b5da99a\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5bfbbd4669-ff7n8"
Apr 24 21:55:35.809522 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:35.809489 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57hrl\" (UniqueName: \"kubernetes.io/projected/b03ba359-fc1b-4bea-90ae-91f8ffdf77c9-kube-api-access-57hrl\") pod \"router-with-refs-pd-test-kserve-prefill-b694fbd9b-m86qq\" (UID: \"b03ba359-fc1b-4bea-90ae-91f8ffdf77c9\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-b694fbd9b-m86qq"
Apr 24 21:55:35.809683 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:35.809535 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/942f3445-f5e3-48c9-a901-64599b5da99a-dshm\") pod \"router-with-refs-pd-test-kserve-5bfbbd4669-ff7n8\" (UID: \"942f3445-f5e3-48c9-a901-64599b5da99a\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5bfbbd4669-ff7n8"
Apr 24 21:55:35.809683 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:35.809554 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b03ba359-fc1b-4bea-90ae-91f8ffdf77c9-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-prefill-b694fbd9b-m86qq\" (UID: \"b03ba359-fc1b-4bea-90ae-91f8ffdf77c9\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-b694fbd9b-m86qq"
Apr 24 21:55:35.809683 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:35.809648 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/942f3445-f5e3-48c9-a901-64599b5da99a-model-cache\") pod \"router-with-refs-pd-test-kserve-5bfbbd4669-ff7n8\" (UID: \"942f3445-f5e3-48c9-a901-64599b5da99a\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5bfbbd4669-ff7n8"
Apr 24 21:55:35.809807 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:35.809698 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9czfl\" (UniqueName: \"kubernetes.io/projected/942f3445-f5e3-48c9-a901-64599b5da99a-kube-api-access-9czfl\") pod \"router-with-refs-pd-test-kserve-5bfbbd4669-ff7n8\" (UID: \"942f3445-f5e3-48c9-a901-64599b5da99a\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5bfbbd4669-ff7n8"
Apr 24 21:55:35.809807 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:35.809732 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b03ba359-fc1b-4bea-90ae-91f8ffdf77c9-model-cache\") pod \"router-with-refs-pd-test-kserve-prefill-b694fbd9b-m86qq\" (UID: \"b03ba359-fc1b-4bea-90ae-91f8ffdf77c9\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-b694fbd9b-m86qq"
Apr 24 21:55:35.809914 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:35.809806 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b03ba359-fc1b-4bea-90ae-91f8ffdf77c9-home\") pod \"router-with-refs-pd-test-kserve-prefill-b694fbd9b-m86qq\" (UID: \"b03ba359-fc1b-4bea-90ae-91f8ffdf77c9\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-b694fbd9b-m86qq"
Apr 24 21:55:35.809914 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:35.809846 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/942f3445-f5e3-48c9-a901-64599b5da99a-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-5bfbbd4669-ff7n8\" (UID: \"942f3445-f5e3-48c9-a901-64599b5da99a\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5bfbbd4669-ff7n8"
Apr 24 21:55:35.810050 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:35.809961 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/942f3445-f5e3-48c9-a901-64599b5da99a-home\") pod \"router-with-refs-pd-test-kserve-5bfbbd4669-ff7n8\" (UID: \"942f3445-f5e3-48c9-a901-64599b5da99a\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5bfbbd4669-ff7n8"
Apr 24 21:55:35.810050 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:35.810003 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/942f3445-f5e3-48c9-a901-64599b5da99a-tls-certs\") pod \"router-with-refs-pd-test-kserve-5bfbbd4669-ff7n8\" (UID: \"942f3445-f5e3-48c9-a901-64599b5da99a\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5bfbbd4669-ff7n8"
Apr 24 21:55:35.810050 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:35.810019 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/942f3445-f5e3-48c9-a901-64599b5da99a-model-cache\") pod \"router-with-refs-pd-test-kserve-5bfbbd4669-ff7n8\" (UID: \"942f3445-f5e3-48c9-a901-64599b5da99a\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5bfbbd4669-ff7n8"
Apr 24 21:55:35.810218 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:35.810061 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/942f3445-f5e3-48c9-a901-64599b5da99a-tmp-dir\") pod \"router-with-refs-pd-test-kserve-5bfbbd4669-ff7n8\" (UID: \"942f3445-f5e3-48c9-a901-64599b5da99a\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5bfbbd4669-ff7n8"
Apr 24 21:55:35.810218 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:35.810092 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b03ba359-fc1b-4bea-90ae-91f8ffdf77c9-tmp-dir\") pod \"router-with-refs-pd-test-kserve-prefill-b694fbd9b-m86qq\" (UID: \"b03ba359-fc1b-4bea-90ae-91f8ffdf77c9\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-b694fbd9b-m86qq"
Apr 24 21:55:35.810218 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:35.810123 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b03ba359-fc1b-4bea-90ae-91f8ffdf77c9-tls-certs\") pod \"router-with-refs-pd-test-kserve-prefill-b694fbd9b-m86qq\" (UID: \"b03ba359-fc1b-4bea-90ae-91f8ffdf77c9\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-b694fbd9b-m86qq"
Apr 24 21:55:35.810218 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:35.810189 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b03ba359-fc1b-4bea-90ae-91f8ffdf77c9-dshm\") pod \"router-with-refs-pd-test-kserve-prefill-b694fbd9b-m86qq\" (UID: \"b03ba359-fc1b-4bea-90ae-91f8ffdf77c9\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-b694fbd9b-m86qq"
Apr 24 21:55:35.810375 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:35.810218 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/942f3445-f5e3-48c9-a901-64599b5da99a-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-5bfbbd4669-ff7n8\" (UID: \"942f3445-f5e3-48c9-a901-64599b5da99a\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5bfbbd4669-ff7n8"
Apr 24 21:55:35.810375 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:35.810297 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/942f3445-f5e3-48c9-a901-64599b5da99a-home\") pod \"router-with-refs-pd-test-kserve-5bfbbd4669-ff7n8\" (UID: \"942f3445-f5e3-48c9-a901-64599b5da99a\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5bfbbd4669-ff7n8"
Apr 24 21:55:35.810375 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:35.810352 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/942f3445-f5e3-48c9-a901-64599b5da99a-tmp-dir\") pod \"router-with-refs-pd-test-kserve-5bfbbd4669-ff7n8\" (UID: \"942f3445-f5e3-48c9-a901-64599b5da99a\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5bfbbd4669-ff7n8"
Apr 24 21:55:35.811862 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:35.811837 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/942f3445-f5e3-48c9-a901-64599b5da99a-dshm\") pod \"router-with-refs-pd-test-kserve-5bfbbd4669-ff7n8\" (UID: \"942f3445-f5e3-48c9-a901-64599b5da99a\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5bfbbd4669-ff7n8"
Apr 24 21:55:35.812412 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:35.812395 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/942f3445-f5e3-48c9-a901-64599b5da99a-tls-certs\") pod \"router-with-refs-pd-test-kserve-5bfbbd4669-ff7n8\" (UID: \"942f3445-f5e3-48c9-a901-64599b5da99a\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5bfbbd4669-ff7n8"
Apr 24 21:55:35.819729 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:35.819700 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9czfl\" (UniqueName: \"kubernetes.io/projected/942f3445-f5e3-48c9-a901-64599b5da99a-kube-api-access-9czfl\") pod \"router-with-refs-pd-test-kserve-5bfbbd4669-ff7n8\" (UID: \"942f3445-f5e3-48c9-a901-64599b5da99a\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5bfbbd4669-ff7n8"
Apr 24 21:55:35.895281 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:35.895194 2560 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d5fc8899959zdp" podUID="a027bf3c-14c7-4726-adc4-7588b3b3147c" containerName="main" probeResult="failure" output="Get \"https://10.134.0.49:8000/health\": dial tcp 10.134.0.49:8000: connect: connection refused"
Apr 24 21:55:35.911415 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:35.911377 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b03ba359-fc1b-4bea-90ae-91f8ffdf77c9-tmp-dir\") pod \"router-with-refs-pd-test-kserve-prefill-b694fbd9b-m86qq\" (UID: \"b03ba359-fc1b-4bea-90ae-91f8ffdf77c9\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-b694fbd9b-m86qq"
Apr 24 21:55:35.911415 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:35.911407 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b03ba359-fc1b-4bea-90ae-91f8ffdf77c9-tls-certs\") pod \"router-with-refs-pd-test-kserve-prefill-b694fbd9b-m86qq\" (UID: \"b03ba359-fc1b-4bea-90ae-91f8ffdf77c9\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-b694fbd9b-m86qq"
Apr 24 21:55:35.911675 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:35.911427 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b03ba359-fc1b-4bea-90ae-91f8ffdf77c9-dshm\") pod \"router-with-refs-pd-test-kserve-prefill-b694fbd9b-m86qq\" (UID: \"b03ba359-fc1b-4bea-90ae-91f8ffdf77c9\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-b694fbd9b-m86qq"
Apr 24 21:55:35.911675 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:35.911537 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-57hrl\" (UniqueName: \"kubernetes.io/projected/b03ba359-fc1b-4bea-90ae-91f8ffdf77c9-kube-api-access-57hrl\") pod \"router-with-refs-pd-test-kserve-prefill-b694fbd9b-m86qq\" (UID: \"b03ba359-fc1b-4bea-90ae-91f8ffdf77c9\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-b694fbd9b-m86qq"
Apr 24 21:55:35.911675 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:35.911586 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b03ba359-fc1b-4bea-90ae-91f8ffdf77c9-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-prefill-b694fbd9b-m86qq\" (UID: \"b03ba359-fc1b-4bea-90ae-91f8ffdf77c9\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-b694fbd9b-m86qq"
Apr 24 21:55:35.911675 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:35.911628 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b03ba359-fc1b-4bea-90ae-91f8ffdf77c9-model-cache\") pod \"router-with-refs-pd-test-kserve-prefill-b694fbd9b-m86qq\" (UID: \"b03ba359-fc1b-4bea-90ae-91f8ffdf77c9\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-b694fbd9b-m86qq"
Apr 24 21:55:35.911905 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:35.911700 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b03ba359-fc1b-4bea-90ae-91f8ffdf77c9-home\") pod \"router-with-refs-pd-test-kserve-prefill-b694fbd9b-m86qq\" (UID: \"b03ba359-fc1b-4bea-90ae-91f8ffdf77c9\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-b694fbd9b-m86qq"
Apr 24 21:55:35.911905 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:35.911764 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b03ba359-fc1b-4bea-90ae-91f8ffdf77c9-tmp-dir\") pod \"router-with-refs-pd-test-kserve-prefill-b694fbd9b-m86qq\" (UID: \"b03ba359-fc1b-4bea-90ae-91f8ffdf77c9\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-b694fbd9b-m86qq"
Apr 24 21:55:35.911905 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:35.911864 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b03ba359-fc1b-4bea-90ae-91f8ffdf77c9-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-prefill-b694fbd9b-m86qq\" (UID: \"b03ba359-fc1b-4bea-90ae-91f8ffdf77c9\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-b694fbd9b-m86qq"
Apr 24 21:55:35.912116 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:35.912009 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b03ba359-fc1b-4bea-90ae-91f8ffdf77c9-home\") pod \"router-with-refs-pd-test-kserve-prefill-b694fbd9b-m86qq\" (UID: \"b03ba359-fc1b-4bea-90ae-91f8ffdf77c9\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-b694fbd9b-m86qq"
Apr 24 21:55:35.912116 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:35.912059 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b03ba359-fc1b-4bea-90ae-91f8ffdf77c9-model-cache\") pod \"router-with-refs-pd-test-kserve-prefill-b694fbd9b-m86qq\" (UID: \"b03ba359-fc1b-4bea-90ae-91f8ffdf77c9\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-b694fbd9b-m86qq"
Apr 24 21:55:35.914191 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:35.914161 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b03ba359-fc1b-4bea-90ae-91f8ffdf77c9-dshm\") pod \"router-with-refs-pd-test-kserve-prefill-b694fbd9b-m86qq\" (UID: \"b03ba359-fc1b-4bea-90ae-91f8ffdf77c9\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-b694fbd9b-m86qq"
Apr 24 21:55:35.914306 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:35.914290 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b03ba359-fc1b-4bea-90ae-91f8ffdf77c9-tls-certs\") pod \"router-with-refs-pd-test-kserve-prefill-b694fbd9b-m86qq\" (UID: \"b03ba359-fc1b-4bea-90ae-91f8ffdf77c9\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-b694fbd9b-m86qq"
Apr 24 21:55:35.919269 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:35.919244 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-57hrl\" (UniqueName: \"kubernetes.io/projected/b03ba359-fc1b-4bea-90ae-91f8ffdf77c9-kube-api-access-57hrl\") pod \"router-with-refs-pd-test-kserve-prefill-b694fbd9b-m86qq\" (UID: \"b03ba359-fc1b-4bea-90ae-91f8ffdf77c9\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-b694fbd9b-m86qq"
Apr 24 21:55:35.940093 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:35.940068 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5bfbbd4669-ff7n8"
Apr 24 21:55:35.967935 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:35.967896 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-b694fbd9b-m86qq"
Apr 24 21:55:36.086836 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:36.086596 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5bfbbd4669-ff7n8"]
Apr 24 21:55:36.089843 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:55:36.089808 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod942f3445_f5e3_48c9_a901_64599b5da99a.slice/crio-917b5a6965544d345fcc0d68412e1a5dd8c06cf748673846397c4f1d82eb5265 WatchSource:0}: Error finding container 917b5a6965544d345fcc0d68412e1a5dd8c06cf748673846397c4f1d82eb5265: Status 404 returned error can't find the container with id 917b5a6965544d345fcc0d68412e1a5dd8c06cf748673846397c4f1d82eb5265
Apr 24 21:55:36.111408 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:36.111386 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-b694fbd9b-m86qq"]
Apr 24 21:55:36.125954 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:55:36.125905 2560 manager.go:1169] Failed to process watch event {EventType:0
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb03ba359_fc1b_4bea_90ae_91f8ffdf77c9.slice/crio-876aeae99a065f181ae7f1125ad2b841be0a150082b5299c6e977f9291546329 WatchSource:0}: Error finding container 876aeae99a065f181ae7f1125ad2b841be0a150082b5299c6e977f9291546329: Status 404 returned error can't find the container with id 876aeae99a065f181ae7f1125ad2b841be0a150082b5299c6e977f9291546329 Apr 24 21:55:36.887397 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:36.887357 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-b694fbd9b-m86qq" event={"ID":"b03ba359-fc1b-4bea-90ae-91f8ffdf77c9","Type":"ContainerStarted","Data":"f42c731a08259f2d266042d428a368a3e54819caca0aba367892338967685ba0"} Apr 24 21:55:36.887397 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:36.887402 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-b694fbd9b-m86qq" event={"ID":"b03ba359-fc1b-4bea-90ae-91f8ffdf77c9","Type":"ContainerStarted","Data":"876aeae99a065f181ae7f1125ad2b841be0a150082b5299c6e977f9291546329"} Apr 24 21:55:36.889123 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:36.889088 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5bfbbd4669-ff7n8" event={"ID":"942f3445-f5e3-48c9-a901-64599b5da99a","Type":"ContainerStarted","Data":"c85d07ae2c7e244c1aa5c4a10b47a932cf69db7b9496ea42457686fcd491febc"} Apr 24 21:55:36.889123 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:36.889121 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5bfbbd4669-ff7n8" event={"ID":"942f3445-f5e3-48c9-a901-64599b5da99a","Type":"ContainerStarted","Data":"917b5a6965544d345fcc0d68412e1a5dd8c06cf748673846397c4f1d82eb5265"} Apr 24 21:55:36.889275 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:36.889230 2560 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5bfbbd4669-ff7n8" Apr 24 21:55:37.902377 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:37.902336 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5bfbbd4669-ff7n8" event={"ID":"942f3445-f5e3-48c9-a901-64599b5da99a","Type":"ContainerStarted","Data":"f282ac03f107a9983809123c54c0e47a59239e42f29bf0ce27aa6735f068c4dc"} Apr 24 21:55:41.427827 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:41.427737 2560 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-585b44dc45-r4v84" podUID="549eef45-c8c3-4375-b49a-9383c8d8525d" containerName="llm-d-routing-sidecar" containerID="cri-o://6e023ab788e84358c07f3c9d962be9c4b3a6cc4902e1db166ecab637826e7f4d" gracePeriod=2 Apr 24 21:55:41.689296 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:41.689274 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-pd-test-kserve-prefill-8f8b95844-msd7t_0b81b123-aae5-4625-97b5-b21d96f905a4/storage-initializer/0.log" Apr 24 21:55:41.689490 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:41.689338 2560 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-8f8b95844-msd7t" Apr 24 21:55:41.703068 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:41.703043 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-pd-test-kserve-585b44dc45-r4v84_549eef45-c8c3-4375-b49a-9383c8d8525d/main/0.log" Apr 24 21:55:41.703782 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:41.703764 2560 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-585b44dc45-r4v84" Apr 24 21:55:41.769219 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:41.769188 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cqc2f\" (UniqueName: \"kubernetes.io/projected/0b81b123-aae5-4625-97b5-b21d96f905a4-kube-api-access-cqc2f\") pod \"0b81b123-aae5-4625-97b5-b21d96f905a4\" (UID: \"0b81b123-aae5-4625-97b5-b21d96f905a4\") " Apr 24 21:55:41.769375 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:41.769247 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0b81b123-aae5-4625-97b5-b21d96f905a4-kserve-provision-location\") pod \"0b81b123-aae5-4625-97b5-b21d96f905a4\" (UID: \"0b81b123-aae5-4625-97b5-b21d96f905a4\") " Apr 24 21:55:41.769375 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:41.769270 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0b81b123-aae5-4625-97b5-b21d96f905a4-tmp-dir\") pod \"0b81b123-aae5-4625-97b5-b21d96f905a4\" (UID: \"0b81b123-aae5-4625-97b5-b21d96f905a4\") " Apr 24 21:55:41.769462 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:41.769391 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/0b81b123-aae5-4625-97b5-b21d96f905a4-model-cache\") pod \"0b81b123-aae5-4625-97b5-b21d96f905a4\" (UID: \"0b81b123-aae5-4625-97b5-b21d96f905a4\") " Apr 24 21:55:41.769462 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:41.769434 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0b81b123-aae5-4625-97b5-b21d96f905a4-tls-certs\") pod \"0b81b123-aae5-4625-97b5-b21d96f905a4\" (UID: \"0b81b123-aae5-4625-97b5-b21d96f905a4\") " Apr 24 21:55:41.769551 
ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:41.769467 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/0b81b123-aae5-4625-97b5-b21d96f905a4-dshm\") pod \"0b81b123-aae5-4625-97b5-b21d96f905a4\" (UID: \"0b81b123-aae5-4625-97b5-b21d96f905a4\") " Apr 24 21:55:41.769551 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:41.769498 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b81b123-aae5-4625-97b5-b21d96f905a4-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "0b81b123-aae5-4625-97b5-b21d96f905a4" (UID: "0b81b123-aae5-4625-97b5-b21d96f905a4"). InnerVolumeSpecName "tmp-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:55:41.769551 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:41.769534 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/0b81b123-aae5-4625-97b5-b21d96f905a4-home\") pod \"0b81b123-aae5-4625-97b5-b21d96f905a4\" (UID: \"0b81b123-aae5-4625-97b5-b21d96f905a4\") " Apr 24 21:55:41.769712 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:41.769590 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b81b123-aae5-4625-97b5-b21d96f905a4-model-cache" (OuterVolumeSpecName: "model-cache") pod "0b81b123-aae5-4625-97b5-b21d96f905a4" (UID: "0b81b123-aae5-4625-97b5-b21d96f905a4"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:55:41.769837 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:41.769812 2560 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0b81b123-aae5-4625-97b5-b21d96f905a4-tmp-dir\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\"" Apr 24 21:55:41.769989 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:41.769847 2560 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/0b81b123-aae5-4625-97b5-b21d96f905a4-model-cache\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\"" Apr 24 21:55:41.769989 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:41.769824 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b81b123-aae5-4625-97b5-b21d96f905a4-home" (OuterVolumeSpecName: "home") pod "0b81b123-aae5-4625-97b5-b21d96f905a4" (UID: "0b81b123-aae5-4625-97b5-b21d96f905a4"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:55:41.771586 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:41.771558 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b81b123-aae5-4625-97b5-b21d96f905a4-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "0b81b123-aae5-4625-97b5-b21d96f905a4" (UID: "0b81b123-aae5-4625-97b5-b21d96f905a4"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:55:41.771679 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:41.771582 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b81b123-aae5-4625-97b5-b21d96f905a4-dshm" (OuterVolumeSpecName: "dshm") pod "0b81b123-aae5-4625-97b5-b21d96f905a4" (UID: "0b81b123-aae5-4625-97b5-b21d96f905a4"). InnerVolumeSpecName "dshm". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:55:41.771679 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:41.771631 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b81b123-aae5-4625-97b5-b21d96f905a4-kube-api-access-cqc2f" (OuterVolumeSpecName: "kube-api-access-cqc2f") pod "0b81b123-aae5-4625-97b5-b21d96f905a4" (UID: "0b81b123-aae5-4625-97b5-b21d96f905a4"). InnerVolumeSpecName "kube-api-access-cqc2f". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:55:41.819131 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:41.819105 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b81b123-aae5-4625-97b5-b21d96f905a4-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "0b81b123-aae5-4625-97b5-b21d96f905a4" (UID: "0b81b123-aae5-4625-97b5-b21d96f905a4"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:55:41.870532 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:41.870505 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/549eef45-c8c3-4375-b49a-9383c8d8525d-tls-certs\") pod \"549eef45-c8c3-4375-b49a-9383c8d8525d\" (UID: \"549eef45-c8c3-4375-b49a-9383c8d8525d\") " Apr 24 21:55:41.870635 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:41.870537 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/549eef45-c8c3-4375-b49a-9383c8d8525d-tmp-dir\") pod \"549eef45-c8c3-4375-b49a-9383c8d8525d\" (UID: \"549eef45-c8c3-4375-b49a-9383c8d8525d\") " Apr 24 21:55:41.870635 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:41.870571 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pw5qs\" (UniqueName: 
\"kubernetes.io/projected/549eef45-c8c3-4375-b49a-9383c8d8525d-kube-api-access-pw5qs\") pod \"549eef45-c8c3-4375-b49a-9383c8d8525d\" (UID: \"549eef45-c8c3-4375-b49a-9383c8d8525d\") " Apr 24 21:55:41.870635 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:41.870599 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/549eef45-c8c3-4375-b49a-9383c8d8525d-home\") pod \"549eef45-c8c3-4375-b49a-9383c8d8525d\" (UID: \"549eef45-c8c3-4375-b49a-9383c8d8525d\") " Apr 24 21:55:41.870635 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:41.870622 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/549eef45-c8c3-4375-b49a-9383c8d8525d-model-cache\") pod \"549eef45-c8c3-4375-b49a-9383c8d8525d\" (UID: \"549eef45-c8c3-4375-b49a-9383c8d8525d\") " Apr 24 21:55:41.870823 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:41.870647 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/549eef45-c8c3-4375-b49a-9383c8d8525d-dshm\") pod \"549eef45-c8c3-4375-b49a-9383c8d8525d\" (UID: \"549eef45-c8c3-4375-b49a-9383c8d8525d\") " Apr 24 21:55:41.870823 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:41.870691 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/549eef45-c8c3-4375-b49a-9383c8d8525d-kserve-provision-location\") pod \"549eef45-c8c3-4375-b49a-9383c8d8525d\" (UID: \"549eef45-c8c3-4375-b49a-9383c8d8525d\") " Apr 24 21:55:41.870955 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:41.870852 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/549eef45-c8c3-4375-b49a-9383c8d8525d-model-cache" (OuterVolumeSpecName: "model-cache") pod "549eef45-c8c3-4375-b49a-9383c8d8525d" (UID: 
"549eef45-c8c3-4375-b49a-9383c8d8525d"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:55:41.870955 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:41.870913 2560 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cqc2f\" (UniqueName: \"kubernetes.io/projected/0b81b123-aae5-4625-97b5-b21d96f905a4-kube-api-access-cqc2f\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\"" Apr 24 21:55:41.871074 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:41.870958 2560 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0b81b123-aae5-4625-97b5-b21d96f905a4-kserve-provision-location\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\"" Apr 24 21:55:41.871074 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:41.870973 2560 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0b81b123-aae5-4625-97b5-b21d96f905a4-tls-certs\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\"" Apr 24 21:55:41.871074 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:41.870988 2560 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/0b81b123-aae5-4625-97b5-b21d96f905a4-dshm\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\"" Apr 24 21:55:41.871074 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:41.871001 2560 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/0b81b123-aae5-4625-97b5-b21d96f905a4-home\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\"" Apr 24 21:55:41.871396 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:41.871368 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/549eef45-c8c3-4375-b49a-9383c8d8525d-home" (OuterVolumeSpecName: "home") pod "549eef45-c8c3-4375-b49a-9383c8d8525d" (UID: 
"549eef45-c8c3-4375-b49a-9383c8d8525d"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:55:41.872863 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:41.872841 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/549eef45-c8c3-4375-b49a-9383c8d8525d-dshm" (OuterVolumeSpecName: "dshm") pod "549eef45-c8c3-4375-b49a-9383c8d8525d" (UID: "549eef45-c8c3-4375-b49a-9383c8d8525d"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:55:41.873063 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:41.873040 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/549eef45-c8c3-4375-b49a-9383c8d8525d-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "549eef45-c8c3-4375-b49a-9383c8d8525d" (UID: "549eef45-c8c3-4375-b49a-9383c8d8525d"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:55:41.873327 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:41.873305 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/549eef45-c8c3-4375-b49a-9383c8d8525d-kube-api-access-pw5qs" (OuterVolumeSpecName: "kube-api-access-pw5qs") pod "549eef45-c8c3-4375-b49a-9383c8d8525d" (UID: "549eef45-c8c3-4375-b49a-9383c8d8525d"). InnerVolumeSpecName "kube-api-access-pw5qs". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:55:41.882120 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:41.882099 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/549eef45-c8c3-4375-b49a-9383c8d8525d-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "549eef45-c8c3-4375-b49a-9383c8d8525d" (UID: "549eef45-c8c3-4375-b49a-9383c8d8525d"). InnerVolumeSpecName "tmp-dir". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:55:41.921657 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:41.921634 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-pd-test-kserve-585b44dc45-r4v84_549eef45-c8c3-4375-b49a-9383c8d8525d/main/0.log" Apr 24 21:55:41.922441 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:41.922414 2560 generic.go:358] "Generic (PLEG): container finished" podID="549eef45-c8c3-4375-b49a-9383c8d8525d" containerID="9784719d02259ffae5383279fd8321432e72502e59edfb48b608d3fa809deebc" exitCode=137 Apr 24 21:55:41.922441 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:41.922439 2560 generic.go:358] "Generic (PLEG): container finished" podID="549eef45-c8c3-4375-b49a-9383c8d8525d" containerID="6e023ab788e84358c07f3c9d962be9c4b3a6cc4902e1db166ecab637826e7f4d" exitCode=0 Apr 24 21:55:41.922601 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:41.922455 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-585b44dc45-r4v84" event={"ID":"549eef45-c8c3-4375-b49a-9383c8d8525d","Type":"ContainerDied","Data":"9784719d02259ffae5383279fd8321432e72502e59edfb48b608d3fa809deebc"} Apr 24 21:55:41.922601 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:41.922500 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-585b44dc45-r4v84" event={"ID":"549eef45-c8c3-4375-b49a-9383c8d8525d","Type":"ContainerDied","Data":"6e023ab788e84358c07f3c9d962be9c4b3a6cc4902e1db166ecab637826e7f4d"} Apr 24 21:55:41.922601 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:41.922515 2560 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-585b44dc45-r4v84" Apr 24 21:55:41.922601 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:41.922544 2560 scope.go:117] "RemoveContainer" containerID="9784719d02259ffae5383279fd8321432e72502e59edfb48b608d3fa809deebc" Apr 24 21:55:41.922781 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:41.922518 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-585b44dc45-r4v84" event={"ID":"549eef45-c8c3-4375-b49a-9383c8d8525d","Type":"ContainerDied","Data":"99fb4a100354adbb381ebc061bf9e74a06c0a26580c4fad85bf40972da8c27df"} Apr 24 21:55:41.924606 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:41.924580 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-pd-test-kserve-prefill-8f8b95844-msd7t_0b81b123-aae5-4625-97b5-b21d96f905a4/storage-initializer/0.log" Apr 24 21:55:41.924703 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:41.924629 2560 generic.go:358] "Generic (PLEG): container finished" podID="0b81b123-aae5-4625-97b5-b21d96f905a4" containerID="88644662690b2d67ca289727b90a4e56ebd285a52f94abfb4dd7fc2403a242c1" exitCode=137 Apr 24 21:55:41.924768 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:41.924707 2560 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-8f8b95844-msd7t" Apr 24 21:55:41.924768 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:41.924718 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-8f8b95844-msd7t" event={"ID":"0b81b123-aae5-4625-97b5-b21d96f905a4","Type":"ContainerDied","Data":"88644662690b2d67ca289727b90a4e56ebd285a52f94abfb4dd7fc2403a242c1"} Apr 24 21:55:41.924955 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:41.924914 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-8f8b95844-msd7t" event={"ID":"0b81b123-aae5-4625-97b5-b21d96f905a4","Type":"ContainerDied","Data":"de82765a5715005bcc058277a61f6b1063310222a19b29b849e3f03f4e0ba70b"} Apr 24 21:55:41.927888 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:41.927867 2560 generic.go:358] "Generic (PLEG): container finished" podID="942f3445-f5e3-48c9-a901-64599b5da99a" containerID="f282ac03f107a9983809123c54c0e47a59239e42f29bf0ce27aa6735f068c4dc" exitCode=0 Apr 24 21:55:41.927985 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:41.927895 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5bfbbd4669-ff7n8" event={"ID":"942f3445-f5e3-48c9-a901-64599b5da99a","Type":"ContainerDied","Data":"f282ac03f107a9983809123c54c0e47a59239e42f29bf0ce27aa6735f068c4dc"} Apr 24 21:55:41.932841 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:41.932820 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/549eef45-c8c3-4375-b49a-9383c8d8525d-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "549eef45-c8c3-4375-b49a-9383c8d8525d" (UID: "549eef45-c8c3-4375-b49a-9383c8d8525d"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:55:41.933461 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:41.933440 2560 scope.go:117] "RemoveContainer" containerID="15d56be8d71909dae49f955d511a6be0f09fcf785d47f24f702926724ad3df64" Apr 24 21:55:41.972153 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:41.971968 2560 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pw5qs\" (UniqueName: \"kubernetes.io/projected/549eef45-c8c3-4375-b49a-9383c8d8525d-kube-api-access-pw5qs\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\"" Apr 24 21:55:41.972153 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:41.972005 2560 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/549eef45-c8c3-4375-b49a-9383c8d8525d-home\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\"" Apr 24 21:55:41.972153 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:41.972022 2560 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/549eef45-c8c3-4375-b49a-9383c8d8525d-model-cache\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\"" Apr 24 21:55:41.972153 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:41.972036 2560 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/549eef45-c8c3-4375-b49a-9383c8d8525d-dshm\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\"" Apr 24 21:55:41.972153 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:41.972049 2560 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/549eef45-c8c3-4375-b49a-9383c8d8525d-kserve-provision-location\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\"" Apr 24 21:55:41.972153 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:41.972063 2560 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/549eef45-c8c3-4375-b49a-9383c8d8525d-tls-certs\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\"" Apr 24 21:55:41.972153 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:41.972077 2560 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/549eef45-c8c3-4375-b49a-9383c8d8525d-tmp-dir\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\"" Apr 24 21:55:41.979478 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:41.979453 2560 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-8f8b95844-msd7t"] Apr 24 21:55:41.984431 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:41.984399 2560 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-8f8b95844-msd7t"] Apr 24 21:55:41.999523 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:41.999502 2560 scope.go:117] "RemoveContainer" containerID="6e023ab788e84358c07f3c9d962be9c4b3a6cc4902e1db166ecab637826e7f4d" Apr 24 21:55:42.008079 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:42.008059 2560 scope.go:117] "RemoveContainer" containerID="9784719d02259ffae5383279fd8321432e72502e59edfb48b608d3fa809deebc" Apr 24 21:55:42.008362 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:55:42.008343 2560 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9784719d02259ffae5383279fd8321432e72502e59edfb48b608d3fa809deebc\": container with ID starting with 9784719d02259ffae5383279fd8321432e72502e59edfb48b608d3fa809deebc not found: ID does not exist" containerID="9784719d02259ffae5383279fd8321432e72502e59edfb48b608d3fa809deebc" Apr 24 21:55:42.008420 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:42.008373 2560 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"9784719d02259ffae5383279fd8321432e72502e59edfb48b608d3fa809deebc"} err="failed to get container status \"9784719d02259ffae5383279fd8321432e72502e59edfb48b608d3fa809deebc\": rpc error: code = NotFound desc = could not find container \"9784719d02259ffae5383279fd8321432e72502e59edfb48b608d3fa809deebc\": container with ID starting with 9784719d02259ffae5383279fd8321432e72502e59edfb48b608d3fa809deebc not found: ID does not exist" Apr 24 21:55:42.008420 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:42.008392 2560 scope.go:117] "RemoveContainer" containerID="15d56be8d71909dae49f955d511a6be0f09fcf785d47f24f702926724ad3df64" Apr 24 21:55:42.008627 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:55:42.008610 2560 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15d56be8d71909dae49f955d511a6be0f09fcf785d47f24f702926724ad3df64\": container with ID starting with 15d56be8d71909dae49f955d511a6be0f09fcf785d47f24f702926724ad3df64 not found: ID does not exist" containerID="15d56be8d71909dae49f955d511a6be0f09fcf785d47f24f702926724ad3df64" Apr 24 21:55:42.008690 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:42.008645 2560 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15d56be8d71909dae49f955d511a6be0f09fcf785d47f24f702926724ad3df64"} err="failed to get container status \"15d56be8d71909dae49f955d511a6be0f09fcf785d47f24f702926724ad3df64\": rpc error: code = NotFound desc = could not find container \"15d56be8d71909dae49f955d511a6be0f09fcf785d47f24f702926724ad3df64\": container with ID starting with 15d56be8d71909dae49f955d511a6be0f09fcf785d47f24f702926724ad3df64 not found: ID does not exist" Apr 24 21:55:42.008690 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:42.008667 2560 scope.go:117] "RemoveContainer" containerID="6e023ab788e84358c07f3c9d962be9c4b3a6cc4902e1db166ecab637826e7f4d" Apr 24 21:55:42.009016 ip-10-0-128-142 
kubenswrapper[2560]: E0424 21:55:42.008992 2560 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e023ab788e84358c07f3c9d962be9c4b3a6cc4902e1db166ecab637826e7f4d\": container with ID starting with 6e023ab788e84358c07f3c9d962be9c4b3a6cc4902e1db166ecab637826e7f4d not found: ID does not exist" containerID="6e023ab788e84358c07f3c9d962be9c4b3a6cc4902e1db166ecab637826e7f4d" Apr 24 21:55:42.009082 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:42.009035 2560 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e023ab788e84358c07f3c9d962be9c4b3a6cc4902e1db166ecab637826e7f4d"} err="failed to get container status \"6e023ab788e84358c07f3c9d962be9c4b3a6cc4902e1db166ecab637826e7f4d\": rpc error: code = NotFound desc = could not find container \"6e023ab788e84358c07f3c9d962be9c4b3a6cc4902e1db166ecab637826e7f4d\": container with ID starting with 6e023ab788e84358c07f3c9d962be9c4b3a6cc4902e1db166ecab637826e7f4d not found: ID does not exist" Apr 24 21:55:42.009082 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:42.009057 2560 scope.go:117] "RemoveContainer" containerID="9784719d02259ffae5383279fd8321432e72502e59edfb48b608d3fa809deebc" Apr 24 21:55:42.009331 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:42.009310 2560 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9784719d02259ffae5383279fd8321432e72502e59edfb48b608d3fa809deebc"} err="failed to get container status \"9784719d02259ffae5383279fd8321432e72502e59edfb48b608d3fa809deebc\": rpc error: code = NotFound desc = could not find container \"9784719d02259ffae5383279fd8321432e72502e59edfb48b608d3fa809deebc\": container with ID starting with 9784719d02259ffae5383279fd8321432e72502e59edfb48b608d3fa809deebc not found: ID does not exist" Apr 24 21:55:42.009331 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:42.009330 2560 scope.go:117] "RemoveContainer" 
containerID="15d56be8d71909dae49f955d511a6be0f09fcf785d47f24f702926724ad3df64" Apr 24 21:55:42.009591 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:42.009574 2560 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15d56be8d71909dae49f955d511a6be0f09fcf785d47f24f702926724ad3df64"} err="failed to get container status \"15d56be8d71909dae49f955d511a6be0f09fcf785d47f24f702926724ad3df64\": rpc error: code = NotFound desc = could not find container \"15d56be8d71909dae49f955d511a6be0f09fcf785d47f24f702926724ad3df64\": container with ID starting with 15d56be8d71909dae49f955d511a6be0f09fcf785d47f24f702926724ad3df64 not found: ID does not exist" Apr 24 21:55:42.009658 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:42.009591 2560 scope.go:117] "RemoveContainer" containerID="6e023ab788e84358c07f3c9d962be9c4b3a6cc4902e1db166ecab637826e7f4d" Apr 24 21:55:42.009816 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:42.009799 2560 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e023ab788e84358c07f3c9d962be9c4b3a6cc4902e1db166ecab637826e7f4d"} err="failed to get container status \"6e023ab788e84358c07f3c9d962be9c4b3a6cc4902e1db166ecab637826e7f4d\": rpc error: code = NotFound desc = could not find container \"6e023ab788e84358c07f3c9d962be9c4b3a6cc4902e1db166ecab637826e7f4d\": container with ID starting with 6e023ab788e84358c07f3c9d962be9c4b3a6cc4902e1db166ecab637826e7f4d not found: ID does not exist" Apr 24 21:55:42.009816 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:42.009815 2560 scope.go:117] "RemoveContainer" containerID="88644662690b2d67ca289727b90a4e56ebd285a52f94abfb4dd7fc2403a242c1" Apr 24 21:55:42.059361 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:42.059331 2560 scope.go:117] "RemoveContainer" containerID="88644662690b2d67ca289727b90a4e56ebd285a52f94abfb4dd7fc2403a242c1" Apr 24 21:55:42.059647 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:55:42.059631 2560 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88644662690b2d67ca289727b90a4e56ebd285a52f94abfb4dd7fc2403a242c1\": container with ID starting with 88644662690b2d67ca289727b90a4e56ebd285a52f94abfb4dd7fc2403a242c1 not found: ID does not exist" containerID="88644662690b2d67ca289727b90a4e56ebd285a52f94abfb4dd7fc2403a242c1" Apr 24 21:55:42.059711 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:42.059653 2560 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88644662690b2d67ca289727b90a4e56ebd285a52f94abfb4dd7fc2403a242c1"} err="failed to get container status \"88644662690b2d67ca289727b90a4e56ebd285a52f94abfb4dd7fc2403a242c1\": rpc error: code = NotFound desc = could not find container \"88644662690b2d67ca289727b90a4e56ebd285a52f94abfb4dd7fc2403a242c1\": container with ID starting with 88644662690b2d67ca289727b90a4e56ebd285a52f94abfb4dd7fc2403a242c1 not found: ID does not exist" Apr 24 21:55:42.249312 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:42.249251 2560 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-585b44dc45-r4v84"] Apr 24 21:55:42.252907 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:42.252864 2560 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-585b44dc45-r4v84"] Apr 24 21:55:42.447722 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:42.447684 2560 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b81b123-aae5-4625-97b5-b21d96f905a4" path="/var/lib/kubelet/pods/0b81b123-aae5-4625-97b5-b21d96f905a4/volumes" Apr 24 21:55:42.448267 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:42.448246 2560 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="549eef45-c8c3-4375-b49a-9383c8d8525d" path="/var/lib/kubelet/pods/549eef45-c8c3-4375-b49a-9383c8d8525d/volumes" Apr 24 21:55:42.933610 
ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:42.933573 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5bfbbd4669-ff7n8" event={"ID":"942f3445-f5e3-48c9-a901-64599b5da99a","Type":"ContainerStarted","Data":"818ac706685efbd13013bcb207f0a90b8d856b7a8f1fcc2b825c94838f12486a"} Apr 24 21:55:42.960541 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:42.960495 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5bfbbd4669-ff7n8" podStartSLOduration=7.960477191 podStartE2EDuration="7.960477191s" podCreationTimestamp="2026-04-24 21:55:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:55:42.957641971 +0000 UTC m=+2331.016429652" watchObservedRunningTime="2026-04-24 21:55:42.960477191 +0000 UTC m=+2331.019264864" Apr 24 21:55:45.895880 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:45.895832 2560 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d5fc8899959zdp" podUID="a027bf3c-14c7-4726-adc4-7588b3b3147c" containerName="main" probeResult="failure" output="Get \"https://10.134.0.49:8000/health\": dial tcp 10.134.0.49:8000: connect: connection refused" Apr 24 21:55:45.940821 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:45.940778 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5bfbbd4669-ff7n8" Apr 24 21:55:45.941015 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:45.940824 2560 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5bfbbd4669-ff7n8" Apr 24 21:55:45.942320 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:45.942284 2560 prober.go:120] "Probe failed" probeType="Startup" 
pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5bfbbd4669-ff7n8" podUID="942f3445-f5e3-48c9-a901-64599b5da99a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.50:8001/health\": dial tcp 10.134.0.50:8001: connect: connection refused" Apr 24 21:55:55.895848 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:55.895800 2560 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d5fc8899959zdp" podUID="a027bf3c-14c7-4726-adc4-7588b3b3147c" containerName="main" probeResult="failure" output="Get \"https://10.134.0.49:8000/health\": dial tcp 10.134.0.49:8000: connect: connection refused" Apr 24 21:55:55.941461 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:55.941413 2560 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5bfbbd4669-ff7n8" podUID="942f3445-f5e3-48c9-a901-64599b5da99a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.50:8001/health\": dial tcp 10.134.0.50:8001: connect: connection refused" Apr 24 21:55:55.962561 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:55:55.962530 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5bfbbd4669-ff7n8" Apr 24 21:56:05.895712 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:56:05.895662 2560 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d5fc8899959zdp" podUID="a027bf3c-14c7-4726-adc4-7588b3b3147c" containerName="main" probeResult="failure" output="Get \"https://10.134.0.49:8000/health\": dial tcp 10.134.0.49:8000: connect: connection refused" Apr 24 21:56:05.940793 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:56:05.940746 2560 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5bfbbd4669-ff7n8" podUID="942f3445-f5e3-48c9-a901-64599b5da99a" 
containerName="main" probeResult="failure" output="Get \"https://10.134.0.50:8001/health\": dial tcp 10.134.0.50:8001: connect: connection refused" Apr 24 21:56:15.896308 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:56:15.896238 2560 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d5fc8899959zdp" podUID="a027bf3c-14c7-4726-adc4-7588b3b3147c" containerName="main" probeResult="failure" output="Get \"https://10.134.0.49:8000/health\": dial tcp 10.134.0.49:8000: connect: connection refused" Apr 24 21:56:15.941209 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:56:15.941161 2560 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5bfbbd4669-ff7n8" podUID="942f3445-f5e3-48c9-a901-64599b5da99a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.50:8001/health\": dial tcp 10.134.0.50:8001: connect: connection refused" Apr 24 21:56:25.911863 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:56:25.911823 2560 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d5fc8899959zdp" Apr 24 21:56:25.924508 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:56:25.924474 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d5fc8899959zdp" Apr 24 21:56:25.941404 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:56:25.941354 2560 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5bfbbd4669-ff7n8" podUID="942f3445-f5e3-48c9-a901-64599b5da99a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.50:8001/health\": dial tcp 10.134.0.50:8001: connect: connection refused" Apr 24 21:56:30.816713 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:56:30.816672 2560 kubelet.go:2553] "SyncLoop DELETE" 
source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d5fc8899959zdp"] Apr 24 21:56:30.817220 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:56:30.817074 2560 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d5fc8899959zdp" podUID="a027bf3c-14c7-4726-adc4-7588b3b3147c" containerName="main" containerID="cri-o://5fe7b983147b8884820076e5b210336df44b9ab965ec66a42e722ec92f22d71c" gracePeriod=30 Apr 24 21:56:35.940663 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:56:35.940618 2560 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5bfbbd4669-ff7n8" podUID="942f3445-f5e3-48c9-a901-64599b5da99a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.50:8001/health\": dial tcp 10.134.0.50:8001: connect: connection refused" Apr 24 21:56:42.030569 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:56:42.030535 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"] Apr 24 21:56:42.031041 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:56:42.030840 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="549eef45-c8c3-4375-b49a-9383c8d8525d" containerName="main" Apr 24 21:56:42.031041 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:56:42.030853 2560 state_mem.go:107] "Deleted CPUSet assignment" podUID="549eef45-c8c3-4375-b49a-9383c8d8525d" containerName="main" Apr 24 21:56:42.031041 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:56:42.030875 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="549eef45-c8c3-4375-b49a-9383c8d8525d" containerName="llm-d-routing-sidecar" Apr 24 21:56:42.031041 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:56:42.030881 2560 state_mem.go:107] "Deleted CPUSet assignment" podUID="549eef45-c8c3-4375-b49a-9383c8d8525d" 
containerName="llm-d-routing-sidecar" Apr 24 21:56:42.031041 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:56:42.030886 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0b81b123-aae5-4625-97b5-b21d96f905a4" containerName="storage-initializer" Apr 24 21:56:42.031041 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:56:42.030892 2560 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b81b123-aae5-4625-97b5-b21d96f905a4" containerName="storage-initializer" Apr 24 21:56:42.031041 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:56:42.030897 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="549eef45-c8c3-4375-b49a-9383c8d8525d" containerName="storage-initializer" Apr 24 21:56:42.031041 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:56:42.030911 2560 state_mem.go:107] "Deleted CPUSet assignment" podUID="549eef45-c8c3-4375-b49a-9383c8d8525d" containerName="storage-initializer" Apr 24 21:56:42.031041 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:56:42.030976 2560 memory_manager.go:356] "RemoveStaleState removing state" podUID="0b81b123-aae5-4625-97b5-b21d96f905a4" containerName="storage-initializer" Apr 24 21:56:42.031041 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:56:42.030990 2560 memory_manager.go:356] "RemoveStaleState removing state" podUID="549eef45-c8c3-4375-b49a-9383c8d8525d" containerName="llm-d-routing-sidecar" Apr 24 21:56:42.031041 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:56:42.031001 2560 memory_manager.go:356] "RemoveStaleState removing state" podUID="549eef45-c8c3-4375-b49a-9383c8d8525d" containerName="main" Apr 24 21:56:42.036960 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:56:42.036913 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 24 21:56:42.042726 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:56:42.042053 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-f312f5-cb7fb8cf-dockercfg-8n4vs\"" Apr 24 21:56:42.042726 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:56:42.042199 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs\"" Apr 24 21:56:42.055446 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:56:42.055415 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"] Apr 24 21:56:42.211234 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:56:42.211202 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/32417d95-034a-40b2-9800-4fb6590145d4-tmp-dir\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"32417d95-034a-40b2-9800-4fb6590145d4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 24 21:56:42.211411 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:56:42.211249 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/32417d95-034a-40b2-9800-4fb6590145d4-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"32417d95-034a-40b2-9800-4fb6590145d4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 24 21:56:42.211411 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:56:42.211273 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/32417d95-034a-40b2-9800-4fb6590145d4-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"32417d95-034a-40b2-9800-4fb6590145d4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 24 21:56:42.211411 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:56:42.211332 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/32417d95-034a-40b2-9800-4fb6590145d4-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"32417d95-034a-40b2-9800-4fb6590145d4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 24 21:56:42.211527 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:56:42.211454 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/32417d95-034a-40b2-9800-4fb6590145d4-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"32417d95-034a-40b2-9800-4fb6590145d4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 24 21:56:42.211527 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:56:42.211490 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/32417d95-034a-40b2-9800-4fb6590145d4-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"32417d95-034a-40b2-9800-4fb6590145d4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 24 21:56:42.211595 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:56:42.211520 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2vtz\" (UniqueName: \"kubernetes.io/projected/32417d95-034a-40b2-9800-4fb6590145d4-kube-api-access-f2vtz\") pod 
\"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"32417d95-034a-40b2-9800-4fb6590145d4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 24 21:56:42.313064 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:56:42.312959 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/32417d95-034a-40b2-9800-4fb6590145d4-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"32417d95-034a-40b2-9800-4fb6590145d4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 24 21:56:42.313064 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:56:42.313008 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/32417d95-034a-40b2-9800-4fb6590145d4-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"32417d95-034a-40b2-9800-4fb6590145d4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 24 21:56:42.313064 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:56:42.313038 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f2vtz\" (UniqueName: \"kubernetes.io/projected/32417d95-034a-40b2-9800-4fb6590145d4-kube-api-access-f2vtz\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"32417d95-034a-40b2-9800-4fb6590145d4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 24 21:56:42.313362 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:56:42.313073 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/32417d95-034a-40b2-9800-4fb6590145d4-tmp-dir\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"32417d95-034a-40b2-9800-4fb6590145d4\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 24 21:56:42.313362 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:56:42.313122 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/32417d95-034a-40b2-9800-4fb6590145d4-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"32417d95-034a-40b2-9800-4fb6590145d4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 24 21:56:42.313362 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:56:42.313155 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/32417d95-034a-40b2-9800-4fb6590145d4-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"32417d95-034a-40b2-9800-4fb6590145d4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 24 21:56:42.313362 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:56:42.313192 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/32417d95-034a-40b2-9800-4fb6590145d4-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"32417d95-034a-40b2-9800-4fb6590145d4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 24 21:56:42.313362 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:56:42.313324 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/32417d95-034a-40b2-9800-4fb6590145d4-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"32417d95-034a-40b2-9800-4fb6590145d4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 24 21:56:42.313630 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:56:42.313479 2560 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/32417d95-034a-40b2-9800-4fb6590145d4-tmp-dir\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"32417d95-034a-40b2-9800-4fb6590145d4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 24 21:56:42.313630 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:56:42.313513 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/32417d95-034a-40b2-9800-4fb6590145d4-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"32417d95-034a-40b2-9800-4fb6590145d4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 24 21:56:42.313710 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:56:42.313684 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/32417d95-034a-40b2-9800-4fb6590145d4-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"32417d95-034a-40b2-9800-4fb6590145d4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 24 21:56:42.315528 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:56:42.315480 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/32417d95-034a-40b2-9800-4fb6590145d4-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"32417d95-034a-40b2-9800-4fb6590145d4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 24 21:56:42.315667 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:56:42.315580 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/32417d95-034a-40b2-9800-4fb6590145d4-tls-certs\") pod 
\"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"32417d95-034a-40b2-9800-4fb6590145d4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 24 21:56:42.321071 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:56:42.321044 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2vtz\" (UniqueName: \"kubernetes.io/projected/32417d95-034a-40b2-9800-4fb6590145d4-kube-api-access-f2vtz\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"32417d95-034a-40b2-9800-4fb6590145d4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 24 21:56:42.349117 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:56:42.349083 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 24 21:56:42.493914 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:56:42.493883 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"] Apr 24 21:56:42.496230 ip-10-0-128-142 kubenswrapper[2560]: W0424 21:56:42.496193 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod32417d95_034a_40b2_9800_4fb6590145d4.slice/crio-3e94d4929268d4a7f487b66a87820a7997265ac186c618be65e2ae2f077970d3 WatchSource:0}: Error finding container 3e94d4929268d4a7f487b66a87820a7997265ac186c618be65e2ae2f077970d3: Status 404 returned error can't find the container with id 3e94d4929268d4a7f487b66a87820a7997265ac186c618be65e2ae2f077970d3 Apr 24 21:56:43.194375 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:56:43.194329 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" 
event={"ID":"32417d95-034a-40b2-9800-4fb6590145d4","Type":"ContainerStarted","Data":"6070ea4083a2162371cb206bcd80c976e50410b14bd12ad89657312347391f5c"} Apr 24 21:56:43.194825 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:56:43.194382 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"32417d95-034a-40b2-9800-4fb6590145d4","Type":"ContainerStarted","Data":"3e94d4929268d4a7f487b66a87820a7997265ac186c618be65e2ae2f077970d3"} Apr 24 21:56:45.941376 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:56:45.941324 2560 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5bfbbd4669-ff7n8" podUID="942f3445-f5e3-48c9-a901-64599b5da99a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.50:8001/health\": dial tcp 10.134.0.50:8001: connect: connection refused" Apr 24 21:56:52.544275 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:56:52.544240 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6rbsp_260be1ca-939b-4b9e-9f93-078d2506aef0/ovn-acl-logging/0.log" Apr 24 21:56:52.545370 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:56:52.545346 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-128-142.ec2.internal_784ad9f5e87f1e75291c25fc06106e5e/kube-rbac-proxy-crio/2.log" Apr 24 21:56:52.549172 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:56:52.549151 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6rbsp_260be1ca-939b-4b9e-9f93-078d2506aef0/ovn-acl-logging/0.log" Apr 24 21:56:52.550259 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:56:52.550241 2560 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-128-142.ec2.internal_784ad9f5e87f1e75291c25fc06106e5e/kube-rbac-proxy-crio/2.log" Apr 24 21:56:55.940485 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:56:55.940436 2560 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5bfbbd4669-ff7n8" podUID="942f3445-f5e3-48c9-a901-64599b5da99a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.50:8001/health\": dial tcp 10.134.0.50:8001: connect: connection refused" Apr 24 21:57:01.117814 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:57:01.117784 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d5fc8899959zdp_a027bf3c-14c7-4726-adc4-7588b3b3147c/main/0.log" Apr 24 21:57:01.118386 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:57:01.118250 2560 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d5fc8899959zdp" Apr 24 21:57:01.169940 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:57:01.169891 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a027bf3c-14c7-4726-adc4-7588b3b3147c-tls-certs\") pod \"a027bf3c-14c7-4726-adc4-7588b3b3147c\" (UID: \"a027bf3c-14c7-4726-adc4-7588b3b3147c\") " Apr 24 21:57:01.170120 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:57:01.169948 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a027bf3c-14c7-4726-adc4-7588b3b3147c-tmp-dir\") pod \"a027bf3c-14c7-4726-adc4-7588b3b3147c\" (UID: \"a027bf3c-14c7-4726-adc4-7588b3b3147c\") " Apr 24 21:57:01.170120 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:57:01.169979 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" 
(UniqueName: \"kubernetes.io/empty-dir/a027bf3c-14c7-4726-adc4-7588b3b3147c-home\") pod \"a027bf3c-14c7-4726-adc4-7588b3b3147c\" (UID: \"a027bf3c-14c7-4726-adc4-7588b3b3147c\") " Apr 24 21:57:01.170120 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:57:01.170063 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2h24t\" (UniqueName: \"kubernetes.io/projected/a027bf3c-14c7-4726-adc4-7588b3b3147c-kube-api-access-2h24t\") pod \"a027bf3c-14c7-4726-adc4-7588b3b3147c\" (UID: \"a027bf3c-14c7-4726-adc4-7588b3b3147c\") " Apr 24 21:57:01.170120 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:57:01.170117 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a027bf3c-14c7-4726-adc4-7588b3b3147c-dshm\") pod \"a027bf3c-14c7-4726-adc4-7588b3b3147c\" (UID: \"a027bf3c-14c7-4726-adc4-7588b3b3147c\") " Apr 24 21:57:01.170336 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:57:01.170152 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a027bf3c-14c7-4726-adc4-7588b3b3147c-kserve-provision-location\") pod \"a027bf3c-14c7-4726-adc4-7588b3b3147c\" (UID: \"a027bf3c-14c7-4726-adc4-7588b3b3147c\") " Apr 24 21:57:01.170336 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:57:01.170225 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a027bf3c-14c7-4726-adc4-7588b3b3147c-model-cache\") pod \"a027bf3c-14c7-4726-adc4-7588b3b3147c\" (UID: \"a027bf3c-14c7-4726-adc4-7588b3b3147c\") " Apr 24 21:57:01.170834 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:57:01.170754 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a027bf3c-14c7-4726-adc4-7588b3b3147c-model-cache" (OuterVolumeSpecName: "model-cache") pod "a027bf3c-14c7-4726-adc4-7588b3b3147c" 
(UID: "a027bf3c-14c7-4726-adc4-7588b3b3147c"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:57:01.170834 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:57:01.170766 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a027bf3c-14c7-4726-adc4-7588b3b3147c-home" (OuterVolumeSpecName: "home") pod "a027bf3c-14c7-4726-adc4-7588b3b3147c" (UID: "a027bf3c-14c7-4726-adc4-7588b3b3147c"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:57:01.172325 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:57:01.172299 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a027bf3c-14c7-4726-adc4-7588b3b3147c-dshm" (OuterVolumeSpecName: "dshm") pod "a027bf3c-14c7-4726-adc4-7588b3b3147c" (UID: "a027bf3c-14c7-4726-adc4-7588b3b3147c"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:57:01.172641 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:57:01.172620 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a027bf3c-14c7-4726-adc4-7588b3b3147c-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "a027bf3c-14c7-4726-adc4-7588b3b3147c" (UID: "a027bf3c-14c7-4726-adc4-7588b3b3147c"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:57:01.172853 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:57:01.172831 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a027bf3c-14c7-4726-adc4-7588b3b3147c-kube-api-access-2h24t" (OuterVolumeSpecName: "kube-api-access-2h24t") pod "a027bf3c-14c7-4726-adc4-7588b3b3147c" (UID: "a027bf3c-14c7-4726-adc4-7588b3b3147c"). InnerVolumeSpecName "kube-api-access-2h24t". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:57:01.189169 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:57:01.189129 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a027bf3c-14c7-4726-adc4-7588b3b3147c-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "a027bf3c-14c7-4726-adc4-7588b3b3147c" (UID: "a027bf3c-14c7-4726-adc4-7588b3b3147c"). InnerVolumeSpecName "tmp-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:57:01.238768 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:57:01.238714 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a027bf3c-14c7-4726-adc4-7588b3b3147c-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a027bf3c-14c7-4726-adc4-7588b3b3147c" (UID: "a027bf3c-14c7-4726-adc4-7588b3b3147c"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:57:01.264233 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:57:01.264200 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d5fc8899959zdp_a027bf3c-14c7-4726-adc4-7588b3b3147c/main/0.log" Apr 24 21:57:01.264615 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:57:01.264582 2560 generic.go:358] "Generic (PLEG): container finished" podID="a027bf3c-14c7-4726-adc4-7588b3b3147c" containerID="5fe7b983147b8884820076e5b210336df44b9ab965ec66a42e722ec92f22d71c" exitCode=137 Apr 24 21:57:01.264727 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:57:01.264667 2560 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d5fc8899959zdp" Apr 24 21:57:01.264793 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:57:01.264665 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d5fc8899959zdp" event={"ID":"a027bf3c-14c7-4726-adc4-7588b3b3147c","Type":"ContainerDied","Data":"5fe7b983147b8884820076e5b210336df44b9ab965ec66a42e722ec92f22d71c"} Apr 24 21:57:01.264793 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:57:01.264777 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d5fc8899959zdp" event={"ID":"a027bf3c-14c7-4726-adc4-7588b3b3147c","Type":"ContainerDied","Data":"1575c7a57bfbda2f7b0b08feabb520588dbb79759ec3b215aa937bc49b7848f9"} Apr 24 21:57:01.264894 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:57:01.264795 2560 scope.go:117] "RemoveContainer" containerID="5fe7b983147b8884820076e5b210336df44b9ab965ec66a42e722ec92f22d71c" Apr 24 21:57:01.273797 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:57:01.273763 2560 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a027bf3c-14c7-4726-adc4-7588b3b3147c-tls-certs\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\"" Apr 24 21:57:01.273797 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:57:01.273799 2560 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a027bf3c-14c7-4726-adc4-7588b3b3147c-tmp-dir\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\"" Apr 24 21:57:01.274136 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:57:01.273814 2560 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a027bf3c-14c7-4726-adc4-7588b3b3147c-home\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\"" Apr 24 21:57:01.274136 ip-10-0-128-142 
kubenswrapper[2560]: I0424 21:57:01.273833 2560 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2h24t\" (UniqueName: \"kubernetes.io/projected/a027bf3c-14c7-4726-adc4-7588b3b3147c-kube-api-access-2h24t\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\"" Apr 24 21:57:01.274136 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:57:01.273846 2560 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a027bf3c-14c7-4726-adc4-7588b3b3147c-dshm\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\"" Apr 24 21:57:01.274136 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:57:01.273860 2560 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a027bf3c-14c7-4726-adc4-7588b3b3147c-kserve-provision-location\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\"" Apr 24 21:57:01.274136 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:57:01.273874 2560 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a027bf3c-14c7-4726-adc4-7588b3b3147c-model-cache\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\"" Apr 24 21:57:01.281043 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:57:01.281013 2560 scope.go:117] "RemoveContainer" containerID="c63b0b1568a921ca6b64400100ed904a3ae3a02dd738dcfcf3aa6e58f5e9f380" Apr 24 21:57:01.295246 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:57:01.295215 2560 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d5fc8899959zdp"] Apr 24 21:57:01.296775 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:57:01.296745 2560 scope.go:117] "RemoveContainer" containerID="5fe7b983147b8884820076e5b210336df44b9ab965ec66a42e722ec92f22d71c" Apr 24 21:57:01.297195 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:57:01.297168 2560 log.go:32] "ContainerStatus from runtime service failed" err="rpc 
error: code = NotFound desc = could not find container \"5fe7b983147b8884820076e5b210336df44b9ab965ec66a42e722ec92f22d71c\": container with ID starting with 5fe7b983147b8884820076e5b210336df44b9ab965ec66a42e722ec92f22d71c not found: ID does not exist" containerID="5fe7b983147b8884820076e5b210336df44b9ab965ec66a42e722ec92f22d71c" Apr 24 21:57:01.297267 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:57:01.297210 2560 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5fe7b983147b8884820076e5b210336df44b9ab965ec66a42e722ec92f22d71c"} err="failed to get container status \"5fe7b983147b8884820076e5b210336df44b9ab965ec66a42e722ec92f22d71c\": rpc error: code = NotFound desc = could not find container \"5fe7b983147b8884820076e5b210336df44b9ab965ec66a42e722ec92f22d71c\": container with ID starting with 5fe7b983147b8884820076e5b210336df44b9ab965ec66a42e722ec92f22d71c not found: ID does not exist" Apr 24 21:57:01.297267 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:57:01.297240 2560 scope.go:117] "RemoveContainer" containerID="c63b0b1568a921ca6b64400100ed904a3ae3a02dd738dcfcf3aa6e58f5e9f380" Apr 24 21:57:01.297550 ip-10-0-128-142 kubenswrapper[2560]: E0424 21:57:01.297527 2560 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c63b0b1568a921ca6b64400100ed904a3ae3a02dd738dcfcf3aa6e58f5e9f380\": container with ID starting with c63b0b1568a921ca6b64400100ed904a3ae3a02dd738dcfcf3aa6e58f5e9f380 not found: ID does not exist" containerID="c63b0b1568a921ca6b64400100ed904a3ae3a02dd738dcfcf3aa6e58f5e9f380" Apr 24 21:57:01.297617 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:57:01.297563 2560 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c63b0b1568a921ca6b64400100ed904a3ae3a02dd738dcfcf3aa6e58f5e9f380"} err="failed to get container status \"c63b0b1568a921ca6b64400100ed904a3ae3a02dd738dcfcf3aa6e58f5e9f380\": rpc error: code = 
NotFound desc = could not find container \"c63b0b1568a921ca6b64400100ed904a3ae3a02dd738dcfcf3aa6e58f5e9f380\": container with ID starting with c63b0b1568a921ca6b64400100ed904a3ae3a02dd738dcfcf3aa6e58f5e9f380 not found: ID does not exist" Apr 24 21:57:01.297724 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:57:01.297701 2560 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d5fc8899959zdp"] Apr 24 21:57:02.445912 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:57:02.445873 2560 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a027bf3c-14c7-4726-adc4-7588b3b3147c" path="/var/lib/kubelet/pods/a027bf3c-14c7-4726-adc4-7588b3b3147c/volumes" Apr 24 21:57:05.940676 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:57:05.940632 2560 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5bfbbd4669-ff7n8" podUID="942f3445-f5e3-48c9-a901-64599b5da99a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.50:8001/health\": dial tcp 10.134.0.50:8001: connect: connection refused" Apr 24 21:57:15.941490 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:57:15.941392 2560 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5bfbbd4669-ff7n8" podUID="942f3445-f5e3-48c9-a901-64599b5da99a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.50:8001/health\": dial tcp 10.134.0.50:8001: connect: connection refused" Apr 24 21:57:25.950964 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:57:25.950903 2560 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5bfbbd4669-ff7n8" Apr 24 21:57:25.963296 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:57:25.963275 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5bfbbd4669-ff7n8" Apr 24 21:57:59.488058 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:57:59.487973 2560 generic.go:358] "Generic (PLEG): container finished" podID="b03ba359-fc1b-4bea-90ae-91f8ffdf77c9" containerID="f42c731a08259f2d266042d428a368a3e54819caca0aba367892338967685ba0" exitCode=0 Apr 24 21:57:59.488478 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:57:59.488046 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-b694fbd9b-m86qq" event={"ID":"b03ba359-fc1b-4bea-90ae-91f8ffdf77c9","Type":"ContainerDied","Data":"f42c731a08259f2d266042d428a368a3e54819caca0aba367892338967685ba0"} Apr 24 21:57:59.489175 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:57:59.489160 2560 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 21:58:00.493765 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:58:00.493732 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-b694fbd9b-m86qq" event={"ID":"b03ba359-fc1b-4bea-90ae-91f8ffdf77c9","Type":"ContainerStarted","Data":"f981bc7f80f18945075c15286cbc98fa3ab3d0e04b407986671524a57d27cc8b"} Apr 24 21:58:00.513679 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:58:00.513622 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-b694fbd9b-m86qq" podStartSLOduration=145.513606715 podStartE2EDuration="2m25.513606715s" podCreationTimestamp="2026-04-24 21:55:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:58:00.511622263 +0000 UTC m=+2468.570409937" watchObservedRunningTime="2026-04-24 21:58:00.513606715 +0000 UTC m=+2468.572394387" Apr 24 21:58:05.968190 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:58:05.968140 2560 
kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-b694fbd9b-m86qq" Apr 24 21:58:05.968190 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:58:05.968185 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-b694fbd9b-m86qq" Apr 24 21:58:05.969431 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:58:05.969403 2560 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-b694fbd9b-m86qq" podUID="b03ba359-fc1b-4bea-90ae-91f8ffdf77c9" containerName="main" probeResult="failure" output="Get \"https://10.134.0.51:8000/health\": dial tcp 10.134.0.51:8000: connect: connection refused" Apr 24 21:58:14.550563 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:58:14.550522 2560 generic.go:358] "Generic (PLEG): container finished" podID="32417d95-034a-40b2-9800-4fb6590145d4" containerID="6070ea4083a2162371cb206bcd80c976e50410b14bd12ad89657312347391f5c" exitCode=0 Apr 24 21:58:14.550563 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:58:14.550567 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"32417d95-034a-40b2-9800-4fb6590145d4","Type":"ContainerDied","Data":"6070ea4083a2162371cb206bcd80c976e50410b14bd12ad89657312347391f5c"} Apr 24 21:58:15.557251 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:58:15.557213 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"32417d95-034a-40b2-9800-4fb6590145d4","Type":"ContainerStarted","Data":"5db810437ce43c86955a5f05b6d068aaa3742a5e44eed0bd7f8cf5a1c57aa67a"} Apr 24 21:58:15.580991 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:58:15.580945 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podStartSLOduration=94.580904296 podStartE2EDuration="1m34.580904296s" podCreationTimestamp="2026-04-24 21:56:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:58:15.578595678 +0000 UTC m=+2483.637383350" watchObservedRunningTime="2026-04-24 21:58:15.580904296 +0000 UTC m=+2483.639691969" Apr 24 21:58:15.968397 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:58:15.968293 2560 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-b694fbd9b-m86qq" podUID="b03ba359-fc1b-4bea-90ae-91f8ffdf77c9" containerName="main" probeResult="failure" output="Get \"https://10.134.0.51:8000/health\": dial tcp 10.134.0.51:8000: connect: connection refused" Apr 24 21:58:22.349631 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:58:22.349592 2560 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 24 21:58:22.351141 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:58:22.351105 2560 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="32417d95-034a-40b2-9800-4fb6590145d4" containerName="main" probeResult="failure" output="Get \"https://10.134.0.52:8000/health\": dial tcp 10.134.0.52:8000: connect: connection refused" Apr 24 21:58:25.969090 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:58:25.969054 2560 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-b694fbd9b-m86qq" podUID="b03ba359-fc1b-4bea-90ae-91f8ffdf77c9" containerName="main" probeResult="failure" output="Get \"https://10.134.0.51:8000/health\": dial tcp 10.134.0.51:8000: connect: connection refused" Apr 24 21:58:32.349916 ip-10-0-128-142 kubenswrapper[2560]: 
I0424 21:58:32.349867 2560 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="32417d95-034a-40b2-9800-4fb6590145d4" containerName="main" probeResult="failure" output="Get \"https://10.134.0.52:8000/health\": dial tcp 10.134.0.52:8000: connect: connection refused" Apr 24 21:58:35.968941 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:58:35.968894 2560 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-b694fbd9b-m86qq" podUID="b03ba359-fc1b-4bea-90ae-91f8ffdf77c9" containerName="main" probeResult="failure" output="Get \"https://10.134.0.51:8000/health\": dial tcp 10.134.0.51:8000: connect: connection refused" Apr 24 21:58:42.349890 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:58:42.349800 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 24 21:58:42.350382 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:58:42.350200 2560 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="32417d95-034a-40b2-9800-4fb6590145d4" containerName="main" probeResult="failure" output="Get \"https://10.134.0.52:8000/health\": dial tcp 10.134.0.52:8000: connect: connection refused" Apr 24 21:58:45.968538 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:58:45.968490 2560 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-b694fbd9b-m86qq" podUID="b03ba359-fc1b-4bea-90ae-91f8ffdf77c9" containerName="main" probeResult="failure" output="Get \"https://10.134.0.51:8000/health\": dial tcp 10.134.0.51:8000: connect: connection refused" Apr 24 21:58:52.350172 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:58:52.350124 2560 prober.go:120] "Probe failed" probeType="Startup" 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="32417d95-034a-40b2-9800-4fb6590145d4" containerName="main" probeResult="failure" output="Get \"https://10.134.0.52:8000/health\": dial tcp 10.134.0.52:8000: connect: connection refused" Apr 24 21:58:55.968313 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:58:55.968263 2560 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-b694fbd9b-m86qq" podUID="b03ba359-fc1b-4bea-90ae-91f8ffdf77c9" containerName="main" probeResult="failure" output="Get \"https://10.134.0.51:8000/health\": dial tcp 10.134.0.51:8000: connect: connection refused" Apr 24 21:59:02.349541 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:59:02.349486 2560 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="32417d95-034a-40b2-9800-4fb6590145d4" containerName="main" probeResult="failure" output="Get \"https://10.134.0.52:8000/health\": dial tcp 10.134.0.52:8000: connect: connection refused" Apr 24 21:59:05.969278 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:59:05.969227 2560 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-b694fbd9b-m86qq" podUID="b03ba359-fc1b-4bea-90ae-91f8ffdf77c9" containerName="main" probeResult="failure" output="Get \"https://10.134.0.51:8000/health\": dial tcp 10.134.0.51:8000: connect: connection refused" Apr 24 21:59:12.350309 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:59:12.350263 2560 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="32417d95-034a-40b2-9800-4fb6590145d4" containerName="main" probeResult="failure" output="Get \"https://10.134.0.52:8000/health\": dial tcp 10.134.0.52:8000: connect: connection refused" Apr 24 21:59:15.968199 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:59:15.968159 
2560 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-b694fbd9b-m86qq" podUID="b03ba359-fc1b-4bea-90ae-91f8ffdf77c9" containerName="main" probeResult="failure" output="Get \"https://10.134.0.51:8000/health\": dial tcp 10.134.0.51:8000: connect: connection refused" Apr 24 21:59:22.350368 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:59:22.350329 2560 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="32417d95-034a-40b2-9800-4fb6590145d4" containerName="main" probeResult="failure" output="Get \"https://10.134.0.52:8000/health\": dial tcp 10.134.0.52:8000: connect: connection refused" Apr 24 21:59:25.968989 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:59:25.968947 2560 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-b694fbd9b-m86qq" podUID="b03ba359-fc1b-4bea-90ae-91f8ffdf77c9" containerName="main" probeResult="failure" output="Get \"https://10.134.0.51:8000/health\": dial tcp 10.134.0.51:8000: connect: connection refused" Apr 24 21:59:32.350621 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:59:32.350575 2560 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="32417d95-034a-40b2-9800-4fb6590145d4" containerName="main" probeResult="failure" output="Get \"https://10.134.0.52:8000/health\": dial tcp 10.134.0.52:8000: connect: connection refused" Apr 24 21:59:35.969256 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:59:35.969208 2560 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-b694fbd9b-m86qq" podUID="b03ba359-fc1b-4bea-90ae-91f8ffdf77c9" containerName="main" probeResult="failure" output="Get \"https://10.134.0.51:8000/health\": dial tcp 10.134.0.51:8000: connect: connection refused" Apr 24 21:59:42.350069 
ip-10-0-128-142 kubenswrapper[2560]: I0424 21:59:42.350012 2560 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="32417d95-034a-40b2-9800-4fb6590145d4" containerName="main" probeResult="failure" output="Get \"https://10.134.0.52:8000/health\": dial tcp 10.134.0.52:8000: connect: connection refused" Apr 24 21:59:45.968562 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:59:45.968522 2560 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-b694fbd9b-m86qq" podUID="b03ba359-fc1b-4bea-90ae-91f8ffdf77c9" containerName="main" probeResult="failure" output="Get \"https://10.134.0.51:8000/health\": dial tcp 10.134.0.51:8000: connect: connection refused" Apr 24 21:59:52.350124 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:59:52.350081 2560 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="32417d95-034a-40b2-9800-4fb6590145d4" containerName="main" probeResult="failure" output="Get \"https://10.134.0.52:8000/health\": dial tcp 10.134.0.52:8000: connect: connection refused" Apr 24 21:59:55.969233 ip-10-0-128-142 kubenswrapper[2560]: I0424 21:59:55.969180 2560 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-b694fbd9b-m86qq" podUID="b03ba359-fc1b-4bea-90ae-91f8ffdf77c9" containerName="main" probeResult="failure" output="Get \"https://10.134.0.51:8000/health\": dial tcp 10.134.0.51:8000: connect: connection refused" Apr 24 22:00:02.350000 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:02.349956 2560 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="32417d95-034a-40b2-9800-4fb6590145d4" containerName="main" probeResult="failure" output="Get \"https://10.134.0.52:8000/health\": dial tcp 10.134.0.52:8000: 
connect: connection refused" Apr 24 22:00:05.977856 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:05.977824 2560 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-b694fbd9b-m86qq" Apr 24 22:00:05.985471 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:05.985444 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-b694fbd9b-m86qq" Apr 24 22:00:12.349974 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:12.349867 2560 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="32417d95-034a-40b2-9800-4fb6590145d4" containerName="main" probeResult="failure" output="Get \"https://10.134.0.52:8000/health\": dial tcp 10.134.0.52:8000: connect: connection refused" Apr 24 22:00:22.369139 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:22.369092 2560 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 24 22:00:22.379912 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:22.379879 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 24 22:00:24.045729 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:24.045378 2560 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-b694fbd9b-m86qq"] Apr 24 22:00:24.046715 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:24.046651 2560 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-b694fbd9b-m86qq" podUID="b03ba359-fc1b-4bea-90ae-91f8ffdf77c9" containerName="main" containerID="cri-o://f981bc7f80f18945075c15286cbc98fa3ab3d0e04b407986671524a57d27cc8b" 
gracePeriod=30 Apr 24 22:00:24.048988 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:24.048960 2560 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5bfbbd4669-ff7n8"] Apr 24 22:00:24.049510 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:24.049432 2560 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5bfbbd4669-ff7n8" podUID="942f3445-f5e3-48c9-a901-64599b5da99a" containerName="main" containerID="cri-o://818ac706685efbd13013bcb207f0a90b8d856b7a8f1fcc2b825c94838f12486a" gracePeriod=30 Apr 24 22:00:30.117520 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:30.117484 2560 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"] Apr 24 22:00:30.117947 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:30.117795 2560 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="32417d95-034a-40b2-9800-4fb6590145d4" containerName="main" containerID="cri-o://5db810437ce43c86955a5f05b6d068aaa3742a5e44eed0bd7f8cf5a1c57aa67a" gracePeriod=30 Apr 24 22:00:30.861215 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:30.861189 2560 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 24 22:00:31.010713 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:31.010673 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/32417d95-034a-40b2-9800-4fb6590145d4-tls-certs\") pod \"32417d95-034a-40b2-9800-4fb6590145d4\" (UID: \"32417d95-034a-40b2-9800-4fb6590145d4\") " Apr 24 22:00:31.010713 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:31.010716 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/32417d95-034a-40b2-9800-4fb6590145d4-tmp-dir\") pod \"32417d95-034a-40b2-9800-4fb6590145d4\" (UID: \"32417d95-034a-40b2-9800-4fb6590145d4\") " Apr 24 22:00:31.010996 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:31.010742 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/32417d95-034a-40b2-9800-4fb6590145d4-dshm\") pod \"32417d95-034a-40b2-9800-4fb6590145d4\" (UID: \"32417d95-034a-40b2-9800-4fb6590145d4\") " Apr 24 22:00:31.010996 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:31.010780 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/32417d95-034a-40b2-9800-4fb6590145d4-home\") pod \"32417d95-034a-40b2-9800-4fb6590145d4\" (UID: \"32417d95-034a-40b2-9800-4fb6590145d4\") " Apr 24 22:00:31.010996 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:31.010803 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f2vtz\" (UniqueName: \"kubernetes.io/projected/32417d95-034a-40b2-9800-4fb6590145d4-kube-api-access-f2vtz\") pod \"32417d95-034a-40b2-9800-4fb6590145d4\" (UID: \"32417d95-034a-40b2-9800-4fb6590145d4\") " Apr 24 22:00:31.010996 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:31.010828 
2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/32417d95-034a-40b2-9800-4fb6590145d4-kserve-provision-location\") pod \"32417d95-034a-40b2-9800-4fb6590145d4\" (UID: \"32417d95-034a-40b2-9800-4fb6590145d4\") " Apr 24 22:00:31.010996 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:31.010860 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/32417d95-034a-40b2-9800-4fb6590145d4-model-cache\") pod \"32417d95-034a-40b2-9800-4fb6590145d4\" (UID: \"32417d95-034a-40b2-9800-4fb6590145d4\") " Apr 24 22:00:31.011305 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:31.011274 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32417d95-034a-40b2-9800-4fb6590145d4-model-cache" (OuterVolumeSpecName: "model-cache") pod "32417d95-034a-40b2-9800-4fb6590145d4" (UID: "32417d95-034a-40b2-9800-4fb6590145d4"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:00:31.011588 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:31.011557 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32417d95-034a-40b2-9800-4fb6590145d4-home" (OuterVolumeSpecName: "home") pod "32417d95-034a-40b2-9800-4fb6590145d4" (UID: "32417d95-034a-40b2-9800-4fb6590145d4"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:00:31.013051 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:31.013019 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32417d95-034a-40b2-9800-4fb6590145d4-dshm" (OuterVolumeSpecName: "dshm") pod "32417d95-034a-40b2-9800-4fb6590145d4" (UID: "32417d95-034a-40b2-9800-4fb6590145d4"). InnerVolumeSpecName "dshm". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:00:31.013161 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:31.013056 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32417d95-034a-40b2-9800-4fb6590145d4-kube-api-access-f2vtz" (OuterVolumeSpecName: "kube-api-access-f2vtz") pod "32417d95-034a-40b2-9800-4fb6590145d4" (UID: "32417d95-034a-40b2-9800-4fb6590145d4"). InnerVolumeSpecName "kube-api-access-f2vtz". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:00:31.013161 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:31.013106 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32417d95-034a-40b2-9800-4fb6590145d4-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "32417d95-034a-40b2-9800-4fb6590145d4" (UID: "32417d95-034a-40b2-9800-4fb6590145d4"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:00:31.030554 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:31.030502 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32417d95-034a-40b2-9800-4fb6590145d4-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "32417d95-034a-40b2-9800-4fb6590145d4" (UID: "32417d95-034a-40b2-9800-4fb6590145d4"). InnerVolumeSpecName "tmp-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:00:31.074876 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:31.074823 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32417d95-034a-40b2-9800-4fb6590145d4-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "32417d95-034a-40b2-9800-4fb6590145d4" (UID: "32417d95-034a-40b2-9800-4fb6590145d4"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:00:31.101432 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:31.101398 2560 generic.go:358] "Generic (PLEG): container finished" podID="32417d95-034a-40b2-9800-4fb6590145d4" containerID="5db810437ce43c86955a5f05b6d068aaa3742a5e44eed0bd7f8cf5a1c57aa67a" exitCode=0 Apr 24 22:00:31.101626 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:31.101470 2560 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 24 22:00:31.101626 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:31.101486 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"32417d95-034a-40b2-9800-4fb6590145d4","Type":"ContainerDied","Data":"5db810437ce43c86955a5f05b6d068aaa3742a5e44eed0bd7f8cf5a1c57aa67a"} Apr 24 22:00:31.101626 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:31.101535 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"32417d95-034a-40b2-9800-4fb6590145d4","Type":"ContainerDied","Data":"3e94d4929268d4a7f487b66a87820a7997265ac186c618be65e2ae2f077970d3"} Apr 24 22:00:31.101626 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:31.101557 2560 scope.go:117] "RemoveContainer" containerID="5db810437ce43c86955a5f05b6d068aaa3742a5e44eed0bd7f8cf5a1c57aa67a" Apr 24 22:00:31.111206 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:31.111183 2560 scope.go:117] "RemoveContainer" containerID="6070ea4083a2162371cb206bcd80c976e50410b14bd12ad89657312347391f5c" Apr 24 22:00:31.111467 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:31.111449 2560 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/32417d95-034a-40b2-9800-4fb6590145d4-model-cache\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath 
\"\"" Apr 24 22:00:31.111527 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:31.111471 2560 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/32417d95-034a-40b2-9800-4fb6590145d4-tls-certs\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\"" Apr 24 22:00:31.111527 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:31.111480 2560 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/32417d95-034a-40b2-9800-4fb6590145d4-tmp-dir\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\"" Apr 24 22:00:31.111527 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:31.111488 2560 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/32417d95-034a-40b2-9800-4fb6590145d4-dshm\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\"" Apr 24 22:00:31.111527 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:31.111496 2560 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/32417d95-034a-40b2-9800-4fb6590145d4-home\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\"" Apr 24 22:00:31.111527 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:31.111504 2560 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-f2vtz\" (UniqueName: \"kubernetes.io/projected/32417d95-034a-40b2-9800-4fb6590145d4-kube-api-access-f2vtz\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\"" Apr 24 22:00:31.111527 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:31.111513 2560 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/32417d95-034a-40b2-9800-4fb6590145d4-kserve-provision-location\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\"" Apr 24 22:00:31.125258 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:31.125233 2560 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"] Apr 24 22:00:31.128550 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:31.128440 2560 scope.go:117] "RemoveContainer" containerID="5db810437ce43c86955a5f05b6d068aaa3742a5e44eed0bd7f8cf5a1c57aa67a" Apr 24 22:00:31.128775 ip-10-0-128-142 kubenswrapper[2560]: E0424 22:00:31.128755 2560 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5db810437ce43c86955a5f05b6d068aaa3742a5e44eed0bd7f8cf5a1c57aa67a\": container with ID starting with 5db810437ce43c86955a5f05b6d068aaa3742a5e44eed0bd7f8cf5a1c57aa67a not found: ID does not exist" containerID="5db810437ce43c86955a5f05b6d068aaa3742a5e44eed0bd7f8cf5a1c57aa67a" Apr 24 22:00:31.128823 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:31.128786 2560 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5db810437ce43c86955a5f05b6d068aaa3742a5e44eed0bd7f8cf5a1c57aa67a"} err="failed to get container status \"5db810437ce43c86955a5f05b6d068aaa3742a5e44eed0bd7f8cf5a1c57aa67a\": rpc error: code = NotFound desc = could not find container \"5db810437ce43c86955a5f05b6d068aaa3742a5e44eed0bd7f8cf5a1c57aa67a\": container with ID starting with 5db810437ce43c86955a5f05b6d068aaa3742a5e44eed0bd7f8cf5a1c57aa67a not found: ID does not exist" Apr 24 22:00:31.128823 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:31.128805 2560 scope.go:117] "RemoveContainer" containerID="6070ea4083a2162371cb206bcd80c976e50410b14bd12ad89657312347391f5c" Apr 24 22:00:31.129095 ip-10-0-128-142 kubenswrapper[2560]: E0424 22:00:31.129073 2560 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6070ea4083a2162371cb206bcd80c976e50410b14bd12ad89657312347391f5c\": container with ID starting with 6070ea4083a2162371cb206bcd80c976e50410b14bd12ad89657312347391f5c not found: ID does not exist" 
containerID="6070ea4083a2162371cb206bcd80c976e50410b14bd12ad89657312347391f5c" Apr 24 22:00:31.129176 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:31.129108 2560 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6070ea4083a2162371cb206bcd80c976e50410b14bd12ad89657312347391f5c"} err="failed to get container status \"6070ea4083a2162371cb206bcd80c976e50410b14bd12ad89657312347391f5c\": rpc error: code = NotFound desc = could not find container \"6070ea4083a2162371cb206bcd80c976e50410b14bd12ad89657312347391f5c\": container with ID starting with 6070ea4083a2162371cb206bcd80c976e50410b14bd12ad89657312347391f5c not found: ID does not exist" Apr 24 22:00:31.129728 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:31.129707 2560 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"] Apr 24 22:00:32.445251 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:32.445216 2560 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32417d95-034a-40b2-9800-4fb6590145d4" path="/var/lib/kubelet/pods/32417d95-034a-40b2-9800-4fb6590145d4/volumes" Apr 24 22:00:45.651026 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:45.650990 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-cm2xx_07c9262b-512b-4a5e-991a-8562666209e4/istio-proxy/0.log" Apr 24 22:00:45.705966 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:45.705917 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5bfbbd4669-ff7n8_942f3445-f5e3-48c9-a901-64599b5da99a/main/0.log" Apr 24 22:00:45.714954 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:45.714904 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5bfbbd4669-ff7n8_942f3445-f5e3-48c9-a901-64599b5da99a/llm-d-routing-sidecar/0.log" Apr 24 
22:00:45.725312 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:45.725269 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5bfbbd4669-ff7n8_942f3445-f5e3-48c9-a901-64599b5da99a/storage-initializer/0.log" Apr 24 22:00:45.745133 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:45.745098 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-b694fbd9b-m86qq_b03ba359-fc1b-4bea-90ae-91f8ffdf77c9/main/0.log" Apr 24 22:00:45.754367 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:45.754339 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-b694fbd9b-m86qq_b03ba359-fc1b-4bea-90ae-91f8ffdf77c9/storage-initializer/0.log" Apr 24 22:00:46.705374 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:46.705344 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-cm2xx_07c9262b-512b-4a5e-991a-8562666209e4/istio-proxy/0.log" Apr 24 22:00:46.747758 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:46.747726 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5bfbbd4669-ff7n8_942f3445-f5e3-48c9-a901-64599b5da99a/main/0.log" Apr 24 22:00:46.756578 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:46.756541 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5bfbbd4669-ff7n8_942f3445-f5e3-48c9-a901-64599b5da99a/llm-d-routing-sidecar/0.log" Apr 24 22:00:46.767661 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:46.767634 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5bfbbd4669-ff7n8_942f3445-f5e3-48c9-a901-64599b5da99a/storage-initializer/0.log" Apr 24 22:00:46.789174 ip-10-0-128-142 kubenswrapper[2560]: I0424 
22:00:46.789141 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-b694fbd9b-m86qq_b03ba359-fc1b-4bea-90ae-91f8ffdf77c9/main/0.log" Apr 24 22:00:46.796367 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:46.796346 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-b694fbd9b-m86qq_b03ba359-fc1b-4bea-90ae-91f8ffdf77c9/storage-initializer/0.log" Apr 24 22:00:47.768509 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:47.768477 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-cm2xx_07c9262b-512b-4a5e-991a-8562666209e4/istio-proxy/0.log" Apr 24 22:00:47.807555 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:47.807516 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5bfbbd4669-ff7n8_942f3445-f5e3-48c9-a901-64599b5da99a/main/0.log" Apr 24 22:00:47.815456 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:47.815434 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5bfbbd4669-ff7n8_942f3445-f5e3-48c9-a901-64599b5da99a/llm-d-routing-sidecar/0.log" Apr 24 22:00:47.825740 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:47.825713 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5bfbbd4669-ff7n8_942f3445-f5e3-48c9-a901-64599b5da99a/storage-initializer/0.log" Apr 24 22:00:47.844842 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:47.844817 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-b694fbd9b-m86qq_b03ba359-fc1b-4bea-90ae-91f8ffdf77c9/main/0.log" Apr 24 22:00:47.853458 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:47.853436 2560 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-b694fbd9b-m86qq_b03ba359-fc1b-4bea-90ae-91f8ffdf77c9/storage-initializer/0.log" Apr 24 22:00:48.807298 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:48.807273 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-cm2xx_07c9262b-512b-4a5e-991a-8562666209e4/istio-proxy/0.log" Apr 24 22:00:48.848684 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:48.848660 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5bfbbd4669-ff7n8_942f3445-f5e3-48c9-a901-64599b5da99a/main/0.log" Apr 24 22:00:48.856613 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:48.856596 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5bfbbd4669-ff7n8_942f3445-f5e3-48c9-a901-64599b5da99a/llm-d-routing-sidecar/0.log" Apr 24 22:00:48.866790 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:48.866768 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5bfbbd4669-ff7n8_942f3445-f5e3-48c9-a901-64599b5da99a/storage-initializer/0.log" Apr 24 22:00:48.885790 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:48.885770 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-b694fbd9b-m86qq_b03ba359-fc1b-4bea-90ae-91f8ffdf77c9/main/0.log" Apr 24 22:00:48.897907 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:48.897886 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-b694fbd9b-m86qq_b03ba359-fc1b-4bea-90ae-91f8ffdf77c9/storage-initializer/0.log" Apr 24 22:00:49.860551 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:49.860516 2560 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-cm2xx_07c9262b-512b-4a5e-991a-8562666209e4/istio-proxy/0.log" Apr 24 22:00:49.904784 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:49.904743 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5bfbbd4669-ff7n8_942f3445-f5e3-48c9-a901-64599b5da99a/main/0.log" Apr 24 22:00:49.913341 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:49.913317 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5bfbbd4669-ff7n8_942f3445-f5e3-48c9-a901-64599b5da99a/llm-d-routing-sidecar/0.log" Apr 24 22:00:49.923581 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:49.923557 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5bfbbd4669-ff7n8_942f3445-f5e3-48c9-a901-64599b5da99a/storage-initializer/0.log" Apr 24 22:00:49.943279 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:49.943259 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-b694fbd9b-m86qq_b03ba359-fc1b-4bea-90ae-91f8ffdf77c9/main/0.log" Apr 24 22:00:49.952640 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:49.952620 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-b694fbd9b-m86qq_b03ba359-fc1b-4bea-90ae-91f8ffdf77c9/storage-initializer/0.log" Apr 24 22:00:50.909862 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:50.909809 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-cm2xx_07c9262b-512b-4a5e-991a-8562666209e4/istio-proxy/0.log" Apr 24 22:00:50.952296 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:50.952268 2560 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5bfbbd4669-ff7n8_942f3445-f5e3-48c9-a901-64599b5da99a/main/0.log" Apr 24 22:00:50.961045 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:50.961011 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5bfbbd4669-ff7n8_942f3445-f5e3-48c9-a901-64599b5da99a/llm-d-routing-sidecar/0.log" Apr 24 22:00:50.971288 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:50.971249 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5bfbbd4669-ff7n8_942f3445-f5e3-48c9-a901-64599b5da99a/storage-initializer/0.log" Apr 24 22:00:50.991884 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:50.991858 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-b694fbd9b-m86qq_b03ba359-fc1b-4bea-90ae-91f8ffdf77c9/main/0.log" Apr 24 22:00:50.998984 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:50.998964 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-b694fbd9b-m86qq_b03ba359-fc1b-4bea-90ae-91f8ffdf77c9/storage-initializer/0.log" Apr 24 22:00:51.948395 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:51.948370 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-cm2xx_07c9262b-512b-4a5e-991a-8562666209e4/istio-proxy/0.log" Apr 24 22:00:51.988416 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:51.988388 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5bfbbd4669-ff7n8_942f3445-f5e3-48c9-a901-64599b5da99a/main/0.log" Apr 24 22:00:51.996907 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:51.996881 2560 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5bfbbd4669-ff7n8_942f3445-f5e3-48c9-a901-64599b5da99a/llm-d-routing-sidecar/0.log" Apr 24 22:00:52.007724 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:52.007695 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5bfbbd4669-ff7n8_942f3445-f5e3-48c9-a901-64599b5da99a/storage-initializer/0.log" Apr 24 22:00:52.033142 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:52.033122 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-b694fbd9b-m86qq_b03ba359-fc1b-4bea-90ae-91f8ffdf77c9/main/0.log" Apr 24 22:00:52.039721 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:52.039699 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-b694fbd9b-m86qq_b03ba359-fc1b-4bea-90ae-91f8ffdf77c9/storage-initializer/0.log" Apr 24 22:00:52.991867 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:52.991794 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-cm2xx_07c9262b-512b-4a5e-991a-8562666209e4/istio-proxy/0.log" Apr 24 22:00:53.030412 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:53.030384 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5bfbbd4669-ff7n8_942f3445-f5e3-48c9-a901-64599b5da99a/main/0.log" Apr 24 22:00:53.038689 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:53.038669 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5bfbbd4669-ff7n8_942f3445-f5e3-48c9-a901-64599b5da99a/llm-d-routing-sidecar/0.log" Apr 24 22:00:53.048285 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:53.048267 2560 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5bfbbd4669-ff7n8_942f3445-f5e3-48c9-a901-64599b5da99a/storage-initializer/0.log" Apr 24 22:00:53.066821 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:53.066786 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-b694fbd9b-m86qq_b03ba359-fc1b-4bea-90ae-91f8ffdf77c9/main/0.log" Apr 24 22:00:53.073503 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:53.073485 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-b694fbd9b-m86qq_b03ba359-fc1b-4bea-90ae-91f8ffdf77c9/storage-initializer/0.log" Apr 24 22:00:54.047032 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:54.047000 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-cm2xx_07c9262b-512b-4a5e-991a-8562666209e4/istio-proxy/0.log" Apr 24 22:00:54.049753 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:54.049728 2560 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5bfbbd4669-ff7n8" podUID="942f3445-f5e3-48c9-a901-64599b5da99a" containerName="llm-d-routing-sidecar" containerID="cri-o://c85d07ae2c7e244c1aa5c4a10b47a932cf69db7b9496ea42457686fcd491febc" gracePeriod=2 Apr 24 22:00:54.092553 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:54.092520 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5bfbbd4669-ff7n8_942f3445-f5e3-48c9-a901-64599b5da99a/main/0.log" Apr 24 22:00:54.100871 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:54.100846 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5bfbbd4669-ff7n8_942f3445-f5e3-48c9-a901-64599b5da99a/llm-d-routing-sidecar/0.log" Apr 24 22:00:54.110153 ip-10-0-128-142 kubenswrapper[2560]: 
I0424 22:00:54.110131 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5bfbbd4669-ff7n8_942f3445-f5e3-48c9-a901-64599b5da99a/storage-initializer/0.log" Apr 24 22:00:54.129201 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:54.129165 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-b694fbd9b-m86qq_b03ba359-fc1b-4bea-90ae-91f8ffdf77c9/main/0.log" Apr 24 22:00:54.137900 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:54.137871 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-b694fbd9b-m86qq_b03ba359-fc1b-4bea-90ae-91f8ffdf77c9/storage-initializer/0.log" Apr 24 22:00:54.193680 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:54.193656 2560 generic.go:358] "Generic (PLEG): container finished" podID="b03ba359-fc1b-4bea-90ae-91f8ffdf77c9" containerID="f981bc7f80f18945075c15286cbc98fa3ab3d0e04b407986671524a57d27cc8b" exitCode=137 Apr 24 22:00:54.193792 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:54.193728 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-b694fbd9b-m86qq" event={"ID":"b03ba359-fc1b-4bea-90ae-91f8ffdf77c9","Type":"ContainerDied","Data":"f981bc7f80f18945075c15286cbc98fa3ab3d0e04b407986671524a57d27cc8b"} Apr 24 22:00:54.195359 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:54.195341 2560 generic.go:358] "Generic (PLEG): container finished" podID="942f3445-f5e3-48c9-a901-64599b5da99a" containerID="c85d07ae2c7e244c1aa5c4a10b47a932cf69db7b9496ea42457686fcd491febc" exitCode=0 Apr 24 22:00:54.195440 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:54.195413 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5bfbbd4669-ff7n8" 
event={"ID":"942f3445-f5e3-48c9-a901-64599b5da99a","Type":"ContainerDied","Data":"c85d07ae2c7e244c1aa5c4a10b47a932cf69db7b9496ea42457686fcd491febc"} Apr 24 22:00:54.396462 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:54.396441 2560 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-b694fbd9b-m86qq" Apr 24 22:00:54.399163 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:54.399146 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5bfbbd4669-ff7n8_942f3445-f5e3-48c9-a901-64599b5da99a/main/0.log" Apr 24 22:00:54.399758 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:54.399742 2560 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5bfbbd4669-ff7n8" Apr 24 22:00:54.406800 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:54.406783 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-57hrl\" (UniqueName: \"kubernetes.io/projected/b03ba359-fc1b-4bea-90ae-91f8ffdf77c9-kube-api-access-57hrl\") pod \"b03ba359-fc1b-4bea-90ae-91f8ffdf77c9\" (UID: \"b03ba359-fc1b-4bea-90ae-91f8ffdf77c9\") " Apr 24 22:00:54.406871 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:54.406821 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/942f3445-f5e3-48c9-a901-64599b5da99a-kserve-provision-location\") pod \"942f3445-f5e3-48c9-a901-64599b5da99a\" (UID: \"942f3445-f5e3-48c9-a901-64599b5da99a\") " Apr 24 22:00:54.406871 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:54.406846 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/942f3445-f5e3-48c9-a901-64599b5da99a-model-cache\") pod \"942f3445-f5e3-48c9-a901-64599b5da99a\" (UID: 
\"942f3445-f5e3-48c9-a901-64599b5da99a\") " Apr 24 22:00:54.406871 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:54.406867 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/942f3445-f5e3-48c9-a901-64599b5da99a-tls-certs\") pod \"942f3445-f5e3-48c9-a901-64599b5da99a\" (UID: \"942f3445-f5e3-48c9-a901-64599b5da99a\") " Apr 24 22:00:54.407047 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:54.406891 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9czfl\" (UniqueName: \"kubernetes.io/projected/942f3445-f5e3-48c9-a901-64599b5da99a-kube-api-access-9czfl\") pod \"942f3445-f5e3-48c9-a901-64599b5da99a\" (UID: \"942f3445-f5e3-48c9-a901-64599b5da99a\") " Apr 24 22:00:54.407047 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:54.406911 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b03ba359-fc1b-4bea-90ae-91f8ffdf77c9-kserve-provision-location\") pod \"b03ba359-fc1b-4bea-90ae-91f8ffdf77c9\" (UID: \"b03ba359-fc1b-4bea-90ae-91f8ffdf77c9\") " Apr 24 22:00:54.407047 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:54.406960 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/942f3445-f5e3-48c9-a901-64599b5da99a-dshm\") pod \"942f3445-f5e3-48c9-a901-64599b5da99a\" (UID: \"942f3445-f5e3-48c9-a901-64599b5da99a\") " Apr 24 22:00:54.407047 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:54.406988 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b03ba359-fc1b-4bea-90ae-91f8ffdf77c9-home\") pod \"b03ba359-fc1b-4bea-90ae-91f8ffdf77c9\" (UID: \"b03ba359-fc1b-4bea-90ae-91f8ffdf77c9\") " Apr 24 22:00:54.407047 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:54.407018 2560 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b03ba359-fc1b-4bea-90ae-91f8ffdf77c9-tmp-dir\") pod \"b03ba359-fc1b-4bea-90ae-91f8ffdf77c9\" (UID: \"b03ba359-fc1b-4bea-90ae-91f8ffdf77c9\") " Apr 24 22:00:54.407047 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:54.407039 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b03ba359-fc1b-4bea-90ae-91f8ffdf77c9-dshm\") pod \"b03ba359-fc1b-4bea-90ae-91f8ffdf77c9\" (UID: \"b03ba359-fc1b-4bea-90ae-91f8ffdf77c9\") " Apr 24 22:00:54.407357 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:54.407067 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b03ba359-fc1b-4bea-90ae-91f8ffdf77c9-model-cache\") pod \"b03ba359-fc1b-4bea-90ae-91f8ffdf77c9\" (UID: \"b03ba359-fc1b-4bea-90ae-91f8ffdf77c9\") " Apr 24 22:00:54.407357 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:54.407102 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/942f3445-f5e3-48c9-a901-64599b5da99a-tmp-dir\") pod \"942f3445-f5e3-48c9-a901-64599b5da99a\" (UID: \"942f3445-f5e3-48c9-a901-64599b5da99a\") " Apr 24 22:00:54.407357 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:54.407126 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b03ba359-fc1b-4bea-90ae-91f8ffdf77c9-tls-certs\") pod \"b03ba359-fc1b-4bea-90ae-91f8ffdf77c9\" (UID: \"b03ba359-fc1b-4bea-90ae-91f8ffdf77c9\") " Apr 24 22:00:54.407357 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:54.407155 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/942f3445-f5e3-48c9-a901-64599b5da99a-home\") pod 
\"942f3445-f5e3-48c9-a901-64599b5da99a\" (UID: \"942f3445-f5e3-48c9-a901-64599b5da99a\") " Apr 24 22:00:54.407845 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:54.407818 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b03ba359-fc1b-4bea-90ae-91f8ffdf77c9-model-cache" (OuterVolumeSpecName: "model-cache") pod "b03ba359-fc1b-4bea-90ae-91f8ffdf77c9" (UID: "b03ba359-fc1b-4bea-90ae-91f8ffdf77c9"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:00:54.408186 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:54.408162 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/942f3445-f5e3-48c9-a901-64599b5da99a-home" (OuterVolumeSpecName: "home") pod "942f3445-f5e3-48c9-a901-64599b5da99a" (UID: "942f3445-f5e3-48c9-a901-64599b5da99a"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:00:54.408276 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:54.408251 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/942f3445-f5e3-48c9-a901-64599b5da99a-model-cache" (OuterVolumeSpecName: "model-cache") pod "942f3445-f5e3-48c9-a901-64599b5da99a" (UID: "942f3445-f5e3-48c9-a901-64599b5da99a"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:00:54.408892 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:54.408869 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b03ba359-fc1b-4bea-90ae-91f8ffdf77c9-home" (OuterVolumeSpecName: "home") pod "b03ba359-fc1b-4bea-90ae-91f8ffdf77c9" (UID: "b03ba359-fc1b-4bea-90ae-91f8ffdf77c9"). InnerVolumeSpecName "home". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:00:54.409818 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:54.409790 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/942f3445-f5e3-48c9-a901-64599b5da99a-dshm" (OuterVolumeSpecName: "dshm") pod "942f3445-f5e3-48c9-a901-64599b5da99a" (UID: "942f3445-f5e3-48c9-a901-64599b5da99a"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:00:54.411339 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:54.411308 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/942f3445-f5e3-48c9-a901-64599b5da99a-kube-api-access-9czfl" (OuterVolumeSpecName: "kube-api-access-9czfl") pod "942f3445-f5e3-48c9-a901-64599b5da99a" (UID: "942f3445-f5e3-48c9-a901-64599b5da99a"). InnerVolumeSpecName "kube-api-access-9czfl". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:00:54.411827 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:54.411807 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b03ba359-fc1b-4bea-90ae-91f8ffdf77c9-dshm" (OuterVolumeSpecName: "dshm") pod "b03ba359-fc1b-4bea-90ae-91f8ffdf77c9" (UID: "b03ba359-fc1b-4bea-90ae-91f8ffdf77c9"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:00:54.411988 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:54.411822 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/942f3445-f5e3-48c9-a901-64599b5da99a-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "942f3445-f5e3-48c9-a901-64599b5da99a" (UID: "942f3445-f5e3-48c9-a901-64599b5da99a"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:00:54.412383 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:54.412358 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b03ba359-fc1b-4bea-90ae-91f8ffdf77c9-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "b03ba359-fc1b-4bea-90ae-91f8ffdf77c9" (UID: "b03ba359-fc1b-4bea-90ae-91f8ffdf77c9"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:00:54.412492 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:54.412390 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b03ba359-fc1b-4bea-90ae-91f8ffdf77c9-kube-api-access-57hrl" (OuterVolumeSpecName: "kube-api-access-57hrl") pod "b03ba359-fc1b-4bea-90ae-91f8ffdf77c9" (UID: "b03ba359-fc1b-4bea-90ae-91f8ffdf77c9"). InnerVolumeSpecName "kube-api-access-57hrl". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:00:54.424360 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:54.424333 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b03ba359-fc1b-4bea-90ae-91f8ffdf77c9-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "b03ba359-fc1b-4bea-90ae-91f8ffdf77c9" (UID: "b03ba359-fc1b-4bea-90ae-91f8ffdf77c9"). InnerVolumeSpecName "tmp-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:00:54.429272 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:54.429249 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/942f3445-f5e3-48c9-a901-64599b5da99a-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "942f3445-f5e3-48c9-a901-64599b5da99a" (UID: "942f3445-f5e3-48c9-a901-64599b5da99a"). InnerVolumeSpecName "tmp-dir". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:00:54.472534 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:54.472502 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/942f3445-f5e3-48c9-a901-64599b5da99a-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "942f3445-f5e3-48c9-a901-64599b5da99a" (UID: "942f3445-f5e3-48c9-a901-64599b5da99a"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:00:54.482137 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:54.482108 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b03ba359-fc1b-4bea-90ae-91f8ffdf77c9-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "b03ba359-fc1b-4bea-90ae-91f8ffdf77c9" (UID: "b03ba359-fc1b-4bea-90ae-91f8ffdf77c9"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:00:54.507668 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:54.507647 2560 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b03ba359-fc1b-4bea-90ae-91f8ffdf77c9-model-cache\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\"" Apr 24 22:00:54.507668 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:54.507668 2560 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/942f3445-f5e3-48c9-a901-64599b5da99a-tmp-dir\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\"" Apr 24 22:00:54.507795 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:54.507677 2560 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b03ba359-fc1b-4bea-90ae-91f8ffdf77c9-tls-certs\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\"" Apr 24 22:00:54.507795 ip-10-0-128-142 
kubenswrapper[2560]: I0424 22:00:54.507685 2560 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/942f3445-f5e3-48c9-a901-64599b5da99a-home\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\"" Apr 24 22:00:54.507795 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:54.507693 2560 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-57hrl\" (UniqueName: \"kubernetes.io/projected/b03ba359-fc1b-4bea-90ae-91f8ffdf77c9-kube-api-access-57hrl\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\"" Apr 24 22:00:54.507795 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:54.507703 2560 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/942f3445-f5e3-48c9-a901-64599b5da99a-kserve-provision-location\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\"" Apr 24 22:00:54.507795 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:54.507712 2560 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/942f3445-f5e3-48c9-a901-64599b5da99a-model-cache\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\"" Apr 24 22:00:54.507795 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:54.507720 2560 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/942f3445-f5e3-48c9-a901-64599b5da99a-tls-certs\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\"" Apr 24 22:00:54.507795 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:54.507728 2560 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9czfl\" (UniqueName: \"kubernetes.io/projected/942f3445-f5e3-48c9-a901-64599b5da99a-kube-api-access-9czfl\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\"" Apr 24 22:00:54.507795 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:54.507737 2560 reconciler_common.go:299] "Volume detached for volume 
\"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b03ba359-fc1b-4bea-90ae-91f8ffdf77c9-kserve-provision-location\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\"" Apr 24 22:00:54.507795 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:54.507745 2560 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/942f3445-f5e3-48c9-a901-64599b5da99a-dshm\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\"" Apr 24 22:00:54.507795 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:54.507753 2560 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b03ba359-fc1b-4bea-90ae-91f8ffdf77c9-home\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\"" Apr 24 22:00:54.507795 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:54.507761 2560 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b03ba359-fc1b-4bea-90ae-91f8ffdf77c9-tmp-dir\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\"" Apr 24 22:00:54.507795 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:54.507769 2560 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b03ba359-fc1b-4bea-90ae-91f8ffdf77c9-dshm\") on node \"ip-10-0-128-142.ec2.internal\" DevicePath \"\"" Apr 24 22:00:55.103823 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:55.103798 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-cm2xx_07c9262b-512b-4a5e-991a-8562666209e4/istio-proxy/0.log" Apr 24 22:00:55.148739 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:55.148715 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5bfbbd4669-ff7n8_942f3445-f5e3-48c9-a901-64599b5da99a/main/0.log" Apr 24 22:00:55.157825 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:55.157801 2560 
log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5bfbbd4669-ff7n8_942f3445-f5e3-48c9-a901-64599b5da99a/llm-d-routing-sidecar/0.log" Apr 24 22:00:55.169501 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:55.169481 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5bfbbd4669-ff7n8_942f3445-f5e3-48c9-a901-64599b5da99a/storage-initializer/0.log" Apr 24 22:00:55.190107 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:55.190084 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-b694fbd9b-m86qq_b03ba359-fc1b-4bea-90ae-91f8ffdf77c9/main/0.log" Apr 24 22:00:55.197191 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:55.197172 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-b694fbd9b-m86qq_b03ba359-fc1b-4bea-90ae-91f8ffdf77c9/storage-initializer/0.log" Apr 24 22:00:55.200271 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:55.200246 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-b694fbd9b-m86qq" event={"ID":"b03ba359-fc1b-4bea-90ae-91f8ffdf77c9","Type":"ContainerDied","Data":"876aeae99a065f181ae7f1125ad2b841be0a150082b5299c6e977f9291546329"} Apr 24 22:00:55.200392 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:55.200287 2560 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-b694fbd9b-m86qq" Apr 24 22:00:55.200392 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:55.200293 2560 scope.go:117] "RemoveContainer" containerID="f981bc7f80f18945075c15286cbc98fa3ab3d0e04b407986671524a57d27cc8b" Apr 24 22:00:55.201680 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:55.201661 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5bfbbd4669-ff7n8_942f3445-f5e3-48c9-a901-64599b5da99a/main/0.log" Apr 24 22:00:55.202417 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:55.202391 2560 generic.go:358] "Generic (PLEG): container finished" podID="942f3445-f5e3-48c9-a901-64599b5da99a" containerID="818ac706685efbd13013bcb207f0a90b8d856b7a8f1fcc2b825c94838f12486a" exitCode=137 Apr 24 22:00:55.202498 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:55.202469 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5bfbbd4669-ff7n8" event={"ID":"942f3445-f5e3-48c9-a901-64599b5da99a","Type":"ContainerDied","Data":"818ac706685efbd13013bcb207f0a90b8d856b7a8f1fcc2b825c94838f12486a"} Apr 24 22:00:55.202498 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:55.202493 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5bfbbd4669-ff7n8" event={"ID":"942f3445-f5e3-48c9-a901-64599b5da99a","Type":"ContainerDied","Data":"917b5a6965544d345fcc0d68412e1a5dd8c06cf748673846397c4f1d82eb5265"} Apr 24 22:00:55.202587 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:55.202570 2560 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5bfbbd4669-ff7n8" Apr 24 22:00:55.210291 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:55.210274 2560 scope.go:117] "RemoveContainer" containerID="f42c731a08259f2d266042d428a368a3e54819caca0aba367892338967685ba0" Apr 24 22:00:55.220494 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:55.220481 2560 scope.go:117] "RemoveContainer" containerID="818ac706685efbd13013bcb207f0a90b8d856b7a8f1fcc2b825c94838f12486a" Apr 24 22:00:55.226871 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:55.226846 2560 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-b694fbd9b-m86qq"] Apr 24 22:00:55.228190 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:55.228172 2560 scope.go:117] "RemoveContainer" containerID="f282ac03f107a9983809123c54c0e47a59239e42f29bf0ce27aa6735f068c4dc" Apr 24 22:00:55.231170 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:55.231146 2560 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-b694fbd9b-m86qq"] Apr 24 22:00:55.242552 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:55.242530 2560 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5bfbbd4669-ff7n8"] Apr 24 22:00:55.247384 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:55.247364 2560 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5bfbbd4669-ff7n8"] Apr 24 22:00:55.288676 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:55.288654 2560 scope.go:117] "RemoveContainer" containerID="c85d07ae2c7e244c1aa5c4a10b47a932cf69db7b9496ea42457686fcd491febc" Apr 24 22:00:55.295652 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:55.295635 2560 scope.go:117] "RemoveContainer" containerID="818ac706685efbd13013bcb207f0a90b8d856b7a8f1fcc2b825c94838f12486a" Apr 24 22:00:55.295894 ip-10-0-128-142 
kubenswrapper[2560]: E0424 22:00:55.295874 2560 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"818ac706685efbd13013bcb207f0a90b8d856b7a8f1fcc2b825c94838f12486a\": container with ID starting with 818ac706685efbd13013bcb207f0a90b8d856b7a8f1fcc2b825c94838f12486a not found: ID does not exist" containerID="818ac706685efbd13013bcb207f0a90b8d856b7a8f1fcc2b825c94838f12486a" Apr 24 22:00:55.295967 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:55.295902 2560 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"818ac706685efbd13013bcb207f0a90b8d856b7a8f1fcc2b825c94838f12486a"} err="failed to get container status \"818ac706685efbd13013bcb207f0a90b8d856b7a8f1fcc2b825c94838f12486a\": rpc error: code = NotFound desc = could not find container \"818ac706685efbd13013bcb207f0a90b8d856b7a8f1fcc2b825c94838f12486a\": container with ID starting with 818ac706685efbd13013bcb207f0a90b8d856b7a8f1fcc2b825c94838f12486a not found: ID does not exist" Apr 24 22:00:55.295967 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:55.295918 2560 scope.go:117] "RemoveContainer" containerID="f282ac03f107a9983809123c54c0e47a59239e42f29bf0ce27aa6735f068c4dc" Apr 24 22:00:55.296162 ip-10-0-128-142 kubenswrapper[2560]: E0424 22:00:55.296143 2560 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f282ac03f107a9983809123c54c0e47a59239e42f29bf0ce27aa6735f068c4dc\": container with ID starting with f282ac03f107a9983809123c54c0e47a59239e42f29bf0ce27aa6735f068c4dc not found: ID does not exist" containerID="f282ac03f107a9983809123c54c0e47a59239e42f29bf0ce27aa6735f068c4dc" Apr 24 22:00:55.296206 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:55.296165 2560 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f282ac03f107a9983809123c54c0e47a59239e42f29bf0ce27aa6735f068c4dc"} 
err="failed to get container status \"f282ac03f107a9983809123c54c0e47a59239e42f29bf0ce27aa6735f068c4dc\": rpc error: code = NotFound desc = could not find container \"f282ac03f107a9983809123c54c0e47a59239e42f29bf0ce27aa6735f068c4dc\": container with ID starting with f282ac03f107a9983809123c54c0e47a59239e42f29bf0ce27aa6735f068c4dc not found: ID does not exist" Apr 24 22:00:55.296206 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:55.296178 2560 scope.go:117] "RemoveContainer" containerID="c85d07ae2c7e244c1aa5c4a10b47a932cf69db7b9496ea42457686fcd491febc" Apr 24 22:00:55.296396 ip-10-0-128-142 kubenswrapper[2560]: E0424 22:00:55.296379 2560 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c85d07ae2c7e244c1aa5c4a10b47a932cf69db7b9496ea42457686fcd491febc\": container with ID starting with c85d07ae2c7e244c1aa5c4a10b47a932cf69db7b9496ea42457686fcd491febc not found: ID does not exist" containerID="c85d07ae2c7e244c1aa5c4a10b47a932cf69db7b9496ea42457686fcd491febc" Apr 24 22:00:55.296442 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:55.296398 2560 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c85d07ae2c7e244c1aa5c4a10b47a932cf69db7b9496ea42457686fcd491febc"} err="failed to get container status \"c85d07ae2c7e244c1aa5c4a10b47a932cf69db7b9496ea42457686fcd491febc\": rpc error: code = NotFound desc = could not find container \"c85d07ae2c7e244c1aa5c4a10b47a932cf69db7b9496ea42457686fcd491febc\": container with ID starting with c85d07ae2c7e244c1aa5c4a10b47a932cf69db7b9496ea42457686fcd491febc not found: ID does not exist" Apr 24 22:00:56.207507 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:56.207478 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-cm2xx_07c9262b-512b-4a5e-991a-8562666209e4/istio-proxy/0.log" Apr 24 22:00:56.445557 ip-10-0-128-142 kubenswrapper[2560]: I0424 
22:00:56.445524 2560 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="942f3445-f5e3-48c9-a901-64599b5da99a" path="/var/lib/kubelet/pods/942f3445-f5e3-48c9-a901-64599b5da99a/volumes" Apr 24 22:00:56.446064 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:56.446042 2560 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b03ba359-fc1b-4bea-90ae-91f8ffdf77c9" path="/var/lib/kubelet/pods/b03ba359-fc1b-4bea-90ae-91f8ffdf77c9/volumes" Apr 24 22:00:57.303394 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:57.303369 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-cm2xx_07c9262b-512b-4a5e-991a-8562666209e4/istio-proxy/0.log" Apr 24 22:00:58.245109 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:58.245085 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-cm2xx_07c9262b-512b-4a5e-991a-8562666209e4/istio-proxy/0.log" Apr 24 22:00:59.181777 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:00:59.181752 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-cm2xx_07c9262b-512b-4a5e-991a-8562666209e4/istio-proxy/0.log" Apr 24 22:01:01.749711 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:01:01.749681 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-68bd676465-p8zxf_a2ae7efd-fb4c-4d74-8d95-c0a8961a47e9/authorino/0.log" Apr 24 22:01:01.789905 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:01:01.789881 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6c886788f8-c22xt_0b22e842-e916-4964-b678-a49174bc3ed7/kuadrant-console-plugin/0.log" Apr 24 22:01:04.158823 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:01:04.158783 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-xhf46/must-gather-5jw29"] Apr 24 
22:01:04.159338 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:01:04.159221 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b03ba359-fc1b-4bea-90ae-91f8ffdf77c9" containerName="storage-initializer" Apr 24 22:01:04.159338 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:01:04.159240 2560 state_mem.go:107] "Deleted CPUSet assignment" podUID="b03ba359-fc1b-4bea-90ae-91f8ffdf77c9" containerName="storage-initializer" Apr 24 22:01:04.159338 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:01:04.159249 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="32417d95-034a-40b2-9800-4fb6590145d4" containerName="main" Apr 24 22:01:04.159338 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:01:04.159257 2560 state_mem.go:107] "Deleted CPUSet assignment" podUID="32417d95-034a-40b2-9800-4fb6590145d4" containerName="main" Apr 24 22:01:04.159338 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:01:04.159271 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="942f3445-f5e3-48c9-a901-64599b5da99a" containerName="storage-initializer" Apr 24 22:01:04.159338 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:01:04.159279 2560 state_mem.go:107] "Deleted CPUSet assignment" podUID="942f3445-f5e3-48c9-a901-64599b5da99a" containerName="storage-initializer" Apr 24 22:01:04.159338 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:01:04.159287 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="942f3445-f5e3-48c9-a901-64599b5da99a" containerName="main" Apr 24 22:01:04.159338 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:01:04.159294 2560 state_mem.go:107] "Deleted CPUSet assignment" podUID="942f3445-f5e3-48c9-a901-64599b5da99a" containerName="main" Apr 24 22:01:04.159338 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:01:04.159314 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="32417d95-034a-40b2-9800-4fb6590145d4" 
containerName="storage-initializer" Apr 24 22:01:04.159338 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:01:04.159321 2560 state_mem.go:107] "Deleted CPUSet assignment" podUID="32417d95-034a-40b2-9800-4fb6590145d4" containerName="storage-initializer" Apr 24 22:01:04.159338 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:01:04.159334 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="942f3445-f5e3-48c9-a901-64599b5da99a" containerName="llm-d-routing-sidecar" Apr 24 22:01:04.159338 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:01:04.159342 2560 state_mem.go:107] "Deleted CPUSet assignment" podUID="942f3445-f5e3-48c9-a901-64599b5da99a" containerName="llm-d-routing-sidecar" Apr 24 22:01:04.159954 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:01:04.159356 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a027bf3c-14c7-4726-adc4-7588b3b3147c" containerName="main" Apr 24 22:01:04.159954 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:01:04.159363 2560 state_mem.go:107] "Deleted CPUSet assignment" podUID="a027bf3c-14c7-4726-adc4-7588b3b3147c" containerName="main" Apr 24 22:01:04.159954 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:01:04.159371 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b03ba359-fc1b-4bea-90ae-91f8ffdf77c9" containerName="main" Apr 24 22:01:04.159954 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:01:04.159379 2560 state_mem.go:107] "Deleted CPUSet assignment" podUID="b03ba359-fc1b-4bea-90ae-91f8ffdf77c9" containerName="main" Apr 24 22:01:04.159954 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:01:04.159395 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a027bf3c-14c7-4726-adc4-7588b3b3147c" containerName="storage-initializer" Apr 24 22:01:04.159954 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:01:04.159404 2560 state_mem.go:107] "Deleted CPUSet assignment" podUID="a027bf3c-14c7-4726-adc4-7588b3b3147c" 
containerName="storage-initializer" Apr 24 22:01:04.159954 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:01:04.159479 2560 memory_manager.go:356] "RemoveStaleState removing state" podUID="a027bf3c-14c7-4726-adc4-7588b3b3147c" containerName="main" Apr 24 22:01:04.159954 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:01:04.159494 2560 memory_manager.go:356] "RemoveStaleState removing state" podUID="942f3445-f5e3-48c9-a901-64599b5da99a" containerName="llm-d-routing-sidecar" Apr 24 22:01:04.159954 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:01:04.159506 2560 memory_manager.go:356] "RemoveStaleState removing state" podUID="32417d95-034a-40b2-9800-4fb6590145d4" containerName="main" Apr 24 22:01:04.159954 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:01:04.159516 2560 memory_manager.go:356] "RemoveStaleState removing state" podUID="b03ba359-fc1b-4bea-90ae-91f8ffdf77c9" containerName="main" Apr 24 22:01:04.159954 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:01:04.159527 2560 memory_manager.go:356] "RemoveStaleState removing state" podUID="942f3445-f5e3-48c9-a901-64599b5da99a" containerName="main" Apr 24 22:01:04.162812 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:01:04.162791 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-xhf46/must-gather-5jw29" Apr 24 22:01:04.166575 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:01:04.166557 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-xhf46\"/\"default-dockercfg-p8nxl\"" Apr 24 22:01:04.166680 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:01:04.166625 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-xhf46\"/\"kube-root-ca.crt\"" Apr 24 22:01:04.166957 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:01:04.166915 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-xhf46\"/\"openshift-service-ca.crt\"" Apr 24 22:01:04.174258 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:01:04.174237 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xhf46/must-gather-5jw29"] Apr 24 22:01:04.175511 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:01:04.175473 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7f4e236c-92ba-4eef-a9a6-5b7b501f70a2-must-gather-output\") pod \"must-gather-5jw29\" (UID: \"7f4e236c-92ba-4eef-a9a6-5b7b501f70a2\") " pod="openshift-must-gather-xhf46/must-gather-5jw29" Apr 24 22:01:04.175646 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:01:04.175537 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vlcq\" (UniqueName: \"kubernetes.io/projected/7f4e236c-92ba-4eef-a9a6-5b7b501f70a2-kube-api-access-5vlcq\") pod \"must-gather-5jw29\" (UID: \"7f4e236c-92ba-4eef-a9a6-5b7b501f70a2\") " pod="openshift-must-gather-xhf46/must-gather-5jw29" Apr 24 22:01:04.276094 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:01:04.276053 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: 
\"kubernetes.io/empty-dir/7f4e236c-92ba-4eef-a9a6-5b7b501f70a2-must-gather-output\") pod \"must-gather-5jw29\" (UID: \"7f4e236c-92ba-4eef-a9a6-5b7b501f70a2\") " pod="openshift-must-gather-xhf46/must-gather-5jw29" Apr 24 22:01:04.276252 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:01:04.276114 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5vlcq\" (UniqueName: \"kubernetes.io/projected/7f4e236c-92ba-4eef-a9a6-5b7b501f70a2-kube-api-access-5vlcq\") pod \"must-gather-5jw29\" (UID: \"7f4e236c-92ba-4eef-a9a6-5b7b501f70a2\") " pod="openshift-must-gather-xhf46/must-gather-5jw29" Apr 24 22:01:04.276437 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:01:04.276417 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7f4e236c-92ba-4eef-a9a6-5b7b501f70a2-must-gather-output\") pod \"must-gather-5jw29\" (UID: \"7f4e236c-92ba-4eef-a9a6-5b7b501f70a2\") " pod="openshift-must-gather-xhf46/must-gather-5jw29" Apr 24 22:01:04.283744 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:01:04.283717 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vlcq\" (UniqueName: \"kubernetes.io/projected/7f4e236c-92ba-4eef-a9a6-5b7b501f70a2-kube-api-access-5vlcq\") pod \"must-gather-5jw29\" (UID: \"7f4e236c-92ba-4eef-a9a6-5b7b501f70a2\") " pod="openshift-must-gather-xhf46/must-gather-5jw29" Apr 24 22:01:04.471551 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:01:04.471476 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-xhf46/must-gather-5jw29" Apr 24 22:01:04.588112 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:01:04.588086 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xhf46/must-gather-5jw29"] Apr 24 22:01:04.589805 ip-10-0-128-142 kubenswrapper[2560]: W0424 22:01:04.589768 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7f4e236c_92ba_4eef_a9a6_5b7b501f70a2.slice/crio-c4c6c5a36a0865fe07b16f9af5c138f44268c0e5164e4a0e9e6146a841182558 WatchSource:0}: Error finding container c4c6c5a36a0865fe07b16f9af5c138f44268c0e5164e4a0e9e6146a841182558: Status 404 returned error can't find the container with id c4c6c5a36a0865fe07b16f9af5c138f44268c0e5164e4a0e9e6146a841182558 Apr 24 22:01:05.239644 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:01:05.239598 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xhf46/must-gather-5jw29" event={"ID":"7f4e236c-92ba-4eef-a9a6-5b7b501f70a2","Type":"ContainerStarted","Data":"c4c6c5a36a0865fe07b16f9af5c138f44268c0e5164e4a0e9e6146a841182558"} Apr 24 22:01:06.250145 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:01:06.250100 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xhf46/must-gather-5jw29" event={"ID":"7f4e236c-92ba-4eef-a9a6-5b7b501f70a2","Type":"ContainerStarted","Data":"fe91b7913c55f73c020ed4dcf79005e2ef26feaa38ca5e0805ef016513924cd9"} Apr 24 22:01:06.250145 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:01:06.250152 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xhf46/must-gather-5jw29" event={"ID":"7f4e236c-92ba-4eef-a9a6-5b7b501f70a2","Type":"ContainerStarted","Data":"d0b7a40c2994c5882f1ee243204bd9dbba6b121efb37b16ff9cffd71fc2fcf5f"} Apr 24 22:01:06.266479 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:01:06.266427 2560 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-must-gather-xhf46/must-gather-5jw29" podStartSLOduration=1.417254467 podStartE2EDuration="2.266408208s" podCreationTimestamp="2026-04-24 22:01:04 +0000 UTC" firstStartedPulling="2026-04-24 22:01:04.591537302 +0000 UTC m=+2652.650324953" lastFinishedPulling="2026-04-24 22:01:05.440691044 +0000 UTC m=+2653.499478694" observedRunningTime="2026-04-24 22:01:06.265051737 +0000 UTC m=+2654.323839416" watchObservedRunningTime="2026-04-24 22:01:06.266408208 +0000 UTC m=+2654.325195881"
Apr 24 22:01:06.957715 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:01:06.957677 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-lgtm8_8f7f49cd-0b2f-4225-a283-3ee2a6d74934/global-pull-secret-syncer/0.log"
Apr 24 22:01:07.131245 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:01:07.131212 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-t6vpk_05f6587b-f5da-428a-8968-80f271212138/konnectivity-agent/0.log"
Apr 24 22:01:07.154276 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:01:07.154232 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-128-142.ec2.internal_97ad1a8b57a0086b081c576d14dd05e7/haproxy/0.log"
Apr 24 22:01:11.000451 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:01:11.000395 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-68bd676465-p8zxf_a2ae7efd-fb4c-4d74-8d95-c0a8961a47e9/authorino/0.log"
Apr 24 22:01:11.097382 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:01:11.097339 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6c886788f8-c22xt_0b22e842-e916-4964-b678-a49174bc3ed7/kuadrant-console-plugin/0.log"
Apr 24 22:01:12.572179 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:01:12.572138 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-nk67g_076d7d40-dc29-4871-a077-5e08e9154463/node-exporter/0.log"
Apr 24 22:01:12.593241 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:01:12.593216 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-nk67g_076d7d40-dc29-4871-a077-5e08e9154463/kube-rbac-proxy/0.log"
Apr 24 22:01:12.612613 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:01:12.612590 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-nk67g_076d7d40-dc29-4871-a077-5e08e9154463/init-textfile/0.log"
Apr 24 22:01:15.880662 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:01:15.880629 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-xhf46/perf-node-gather-daemonset-zjtxl"]
Apr 24 22:01:15.888139 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:01:15.888115 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xhf46/perf-node-gather-daemonset-zjtxl"
Apr 24 22:01:15.892358 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:01:15.892322 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xhf46/perf-node-gather-daemonset-zjtxl"]
Apr 24 22:01:15.984432 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:01:15.984397 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/2977d8c0-16b2-45ba-af92-d5bf77a79539-proc\") pod \"perf-node-gather-daemonset-zjtxl\" (UID: \"2977d8c0-16b2-45ba-af92-d5bf77a79539\") " pod="openshift-must-gather-xhf46/perf-node-gather-daemonset-zjtxl"
Apr 24 22:01:15.984607 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:01:15.984456 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2977d8c0-16b2-45ba-af92-d5bf77a79539-sys\") pod \"perf-node-gather-daemonset-zjtxl\" (UID: \"2977d8c0-16b2-45ba-af92-d5bf77a79539\") " pod="openshift-must-gather-xhf46/perf-node-gather-daemonset-zjtxl"
Apr 24 22:01:15.984607 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:01:15.984547 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2977d8c0-16b2-45ba-af92-d5bf77a79539-lib-modules\") pod \"perf-node-gather-daemonset-zjtxl\" (UID: \"2977d8c0-16b2-45ba-af92-d5bf77a79539\") " pod="openshift-must-gather-xhf46/perf-node-gather-daemonset-zjtxl"
Apr 24 22:01:15.984607 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:01:15.984582 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/2977d8c0-16b2-45ba-af92-d5bf77a79539-podres\") pod \"perf-node-gather-daemonset-zjtxl\" (UID: \"2977d8c0-16b2-45ba-af92-d5bf77a79539\") " pod="openshift-must-gather-xhf46/perf-node-gather-daemonset-zjtxl"
Apr 24 22:01:15.984712 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:01:15.984663 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfpns\" (UniqueName: \"kubernetes.io/projected/2977d8c0-16b2-45ba-af92-d5bf77a79539-kube-api-access-xfpns\") pod \"perf-node-gather-daemonset-zjtxl\" (UID: \"2977d8c0-16b2-45ba-af92-d5bf77a79539\") " pod="openshift-must-gather-xhf46/perf-node-gather-daemonset-zjtxl"
Apr 24 22:01:16.085898 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:01:16.085863 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xfpns\" (UniqueName: \"kubernetes.io/projected/2977d8c0-16b2-45ba-af92-d5bf77a79539-kube-api-access-xfpns\") pod \"perf-node-gather-daemonset-zjtxl\" (UID: \"2977d8c0-16b2-45ba-af92-d5bf77a79539\") " pod="openshift-must-gather-xhf46/perf-node-gather-daemonset-zjtxl"
Apr 24 22:01:16.086068 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:01:16.085913 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/2977d8c0-16b2-45ba-af92-d5bf77a79539-proc\") pod \"perf-node-gather-daemonset-zjtxl\" (UID: \"2977d8c0-16b2-45ba-af92-d5bf77a79539\") " pod="openshift-must-gather-xhf46/perf-node-gather-daemonset-zjtxl"
Apr 24 22:01:16.086068 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:01:16.085969 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2977d8c0-16b2-45ba-af92-d5bf77a79539-sys\") pod \"perf-node-gather-daemonset-zjtxl\" (UID: \"2977d8c0-16b2-45ba-af92-d5bf77a79539\") " pod="openshift-must-gather-xhf46/perf-node-gather-daemonset-zjtxl"
Apr 24 22:01:16.086068 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:01:16.085996 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2977d8c0-16b2-45ba-af92-d5bf77a79539-lib-modules\") pod \"perf-node-gather-daemonset-zjtxl\" (UID: \"2977d8c0-16b2-45ba-af92-d5bf77a79539\") " pod="openshift-must-gather-xhf46/perf-node-gather-daemonset-zjtxl"
Apr 24 22:01:16.086068 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:01:16.086014 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/2977d8c0-16b2-45ba-af92-d5bf77a79539-podres\") pod \"perf-node-gather-daemonset-zjtxl\" (UID: \"2977d8c0-16b2-45ba-af92-d5bf77a79539\") " pod="openshift-must-gather-xhf46/perf-node-gather-daemonset-zjtxl"
Apr 24 22:01:16.086209 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:01:16.086066 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/2977d8c0-16b2-45ba-af92-d5bf77a79539-proc\") pod \"perf-node-gather-daemonset-zjtxl\" (UID: \"2977d8c0-16b2-45ba-af92-d5bf77a79539\") " pod="openshift-must-gather-xhf46/perf-node-gather-daemonset-zjtxl"
Apr 24 22:01:16.086209 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:01:16.086077 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2977d8c0-16b2-45ba-af92-d5bf77a79539-sys\") pod \"perf-node-gather-daemonset-zjtxl\" (UID: \"2977d8c0-16b2-45ba-af92-d5bf77a79539\") " pod="openshift-must-gather-xhf46/perf-node-gather-daemonset-zjtxl"
Apr 24 22:01:16.086209 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:01:16.086146 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/2977d8c0-16b2-45ba-af92-d5bf77a79539-podres\") pod \"perf-node-gather-daemonset-zjtxl\" (UID: \"2977d8c0-16b2-45ba-af92-d5bf77a79539\") " pod="openshift-must-gather-xhf46/perf-node-gather-daemonset-zjtxl"
Apr 24 22:01:16.086209 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:01:16.086156 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2977d8c0-16b2-45ba-af92-d5bf77a79539-lib-modules\") pod \"perf-node-gather-daemonset-zjtxl\" (UID: \"2977d8c0-16b2-45ba-af92-d5bf77a79539\") " pod="openshift-must-gather-xhf46/perf-node-gather-daemonset-zjtxl"
Apr 24 22:01:16.098171 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:01:16.098145 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfpns\" (UniqueName: \"kubernetes.io/projected/2977d8c0-16b2-45ba-af92-d5bf77a79539-kube-api-access-xfpns\") pod \"perf-node-gather-daemonset-zjtxl\" (UID: \"2977d8c0-16b2-45ba-af92-d5bf77a79539\") " pod="openshift-must-gather-xhf46/perf-node-gather-daemonset-zjtxl"
Apr 24 22:01:16.199271 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:01:16.199205 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xhf46/perf-node-gather-daemonset-zjtxl"
Apr 24 22:01:16.548214 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:01:16.548128 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xhf46/perf-node-gather-daemonset-zjtxl"]
Apr 24 22:01:16.550164 ip-10-0-128-142 kubenswrapper[2560]: W0424 22:01:16.550137 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod2977d8c0_16b2_45ba_af92_d5bf77a79539.slice/crio-ee5a8fd2bd925061b23e8dcdd08b30363b52ca4f1f5dd0fb3a9df20960c7a2af WatchSource:0}: Error finding container ee5a8fd2bd925061b23e8dcdd08b30363b52ca4f1f5dd0fb3a9df20960c7a2af: Status 404 returned error can't find the container with id ee5a8fd2bd925061b23e8dcdd08b30363b52ca4f1f5dd0fb3a9df20960c7a2af
Apr 24 22:01:16.647769 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:01:16.647744 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-cx2qc_a282a503-61d3-4d88-bb58-1c4989fe6bd8/dns/0.log"
Apr 24 22:01:16.667966 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:01:16.667949 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-cx2qc_a282a503-61d3-4d88-bb58-1c4989fe6bd8/kube-rbac-proxy/0.log"
Apr 24 22:01:16.819782 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:01:16.819714 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-x6t58_b4deb568-1692-4033-a2a6-45866b8c89db/dns-node-resolver/0.log"
Apr 24 22:01:17.298908 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:01:17.298874 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xhf46/perf-node-gather-daemonset-zjtxl" event={"ID":"2977d8c0-16b2-45ba-af92-d5bf77a79539","Type":"ContainerStarted","Data":"d1127f199fe64f411513e6f43a85674fc6da57eca75dd4f7693d094f844d90f1"}
Apr 24 22:01:17.298908 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:01:17.298913 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xhf46/perf-node-gather-daemonset-zjtxl" event={"ID":"2977d8c0-16b2-45ba-af92-d5bf77a79539","Type":"ContainerStarted","Data":"ee5a8fd2bd925061b23e8dcdd08b30363b52ca4f1f5dd0fb3a9df20960c7a2af"}
Apr 24 22:01:17.299337 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:01:17.298972 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-xhf46/perf-node-gather-daemonset-zjtxl"
Apr 24 22:01:17.313712 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:01:17.313667 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-xhf46/perf-node-gather-daemonset-zjtxl" podStartSLOduration=2.313654865 podStartE2EDuration="2.313654865s" podCreationTimestamp="2026-04-24 22:01:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:01:17.31351803 +0000 UTC m=+2665.372305702" watchObservedRunningTime="2026-04-24 22:01:17.313654865 +0000 UTC m=+2665.372442538"
Apr 24 22:01:17.365042 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:01:17.365016 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-r5z5s_ce514c8a-dcef-4be4-9c59-0a9305bef822/node-ca/0.log"
Apr 24 22:01:18.613025 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:01:18.612989 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-gdwvv_cc47cff2-ad26-4f59-b1d8-6831f76de599/serve-healthcheck-canary/0.log"
Apr 24 22:01:19.095982 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:01:19.095948 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-5dm7q_6d07634f-dd2f-4257-8023-4cc8ec27fae2/kube-rbac-proxy/0.log"
Apr 24 22:01:19.115313 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:01:19.115290 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-5dm7q_6d07634f-dd2f-4257-8023-4cc8ec27fae2/exporter/0.log"
Apr 24 22:01:19.136595 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:01:19.136573 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-5dm7q_6d07634f-dd2f-4257-8023-4cc8ec27fae2/extractor/0.log"
Apr 24 22:01:21.691914 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:01:21.691882 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-7f997f587c-sbdmj_43f237ab-989c-487f-90ed-d10683e377ab/manager/0.log"
Apr 24 22:01:21.714055 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:01:21.714029 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_openshift-lws-operator-bfc7f696d-w7ggm_486f6b20-c896-4f8a-8188-84a01590a54a/openshift-lws-operator/0.log"
Apr 24 22:01:22.668996 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:01:22.668967 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-8h7vn_94e3db84-8dcf-4c23-920a-cdf7da63973e/s3-init/0.log"
Apr 24 22:01:22.698565 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:01:22.698536 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-86cc847c5c-4xfmc_c24a9416-1b41-4f60-bf4e-9cc6f95eac37/seaweedfs/0.log"
Apr 24 22:01:23.311888 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:01:23.311862 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-xhf46/perf-node-gather-daemonset-zjtxl"
Apr 24 22:01:28.609350 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:01:28.609321 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kqxwj_849dc6e4-06dd-4834-a5eb-de6ceebd649f/kube-multus-additional-cni-plugins/0.log"
Apr 24 22:01:28.630352 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:01:28.630324 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kqxwj_849dc6e4-06dd-4834-a5eb-de6ceebd649f/egress-router-binary-copy/0.log"
Apr 24 22:01:28.650842 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:01:28.650822 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kqxwj_849dc6e4-06dd-4834-a5eb-de6ceebd649f/cni-plugins/0.log"
Apr 24 22:01:28.669608 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:01:28.669584 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kqxwj_849dc6e4-06dd-4834-a5eb-de6ceebd649f/bond-cni-plugin/0.log"
Apr 24 22:01:28.688605 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:01:28.688581 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kqxwj_849dc6e4-06dd-4834-a5eb-de6ceebd649f/routeoverride-cni/0.log"
Apr 24 22:01:28.708102 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:01:28.708086 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kqxwj_849dc6e4-06dd-4834-a5eb-de6ceebd649f/whereabouts-cni-bincopy/0.log"
Apr 24 22:01:28.726719 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:01:28.726692 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kqxwj_849dc6e4-06dd-4834-a5eb-de6ceebd649f/whereabouts-cni/0.log"
Apr 24 22:01:28.906961 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:01:28.906869 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-df2mh_d0fb6fb7-0535-4776-8fae-ef83f9bfdcca/kube-multus/0.log"
Apr 24 22:01:28.962046 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:01:28.962021 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-ldwdd_23cacaf9-66cf-483e-89f1-70f1b4c3cc3c/network-metrics-daemon/0.log"
Apr 24 22:01:28.979064 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:01:28.979039 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-ldwdd_23cacaf9-66cf-483e-89f1-70f1b4c3cc3c/kube-rbac-proxy/0.log"
Apr 24 22:01:29.856600 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:01:29.856569 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6rbsp_260be1ca-939b-4b9e-9f93-078d2506aef0/ovn-controller/0.log"
Apr 24 22:01:29.874172 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:01:29.874148 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6rbsp_260be1ca-939b-4b9e-9f93-078d2506aef0/ovn-acl-logging/0.log"
Apr 24 22:01:29.897172 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:01:29.897154 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6rbsp_260be1ca-939b-4b9e-9f93-078d2506aef0/ovn-acl-logging/1.log"
Apr 24 22:01:29.913494 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:01:29.913477 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6rbsp_260be1ca-939b-4b9e-9f93-078d2506aef0/kube-rbac-proxy-node/0.log"
Apr 24 22:01:29.935477 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:01:29.935457 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6rbsp_260be1ca-939b-4b9e-9f93-078d2506aef0/kube-rbac-proxy-ovn-metrics/0.log"
Apr 24 22:01:29.956611 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:01:29.956591 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6rbsp_260be1ca-939b-4b9e-9f93-078d2506aef0/northd/0.log"
Apr 24 22:01:29.976314 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:01:29.976296 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6rbsp_260be1ca-939b-4b9e-9f93-078d2506aef0/nbdb/0.log"
Apr 24 22:01:29.996702 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:01:29.996683 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6rbsp_260be1ca-939b-4b9e-9f93-078d2506aef0/sbdb/0.log"
Apr 24 22:01:30.242440 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:01:30.242359 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6rbsp_260be1ca-939b-4b9e-9f93-078d2506aef0/ovnkube-controller/0.log"
Apr 24 22:01:31.971468 ip-10-0-128-142 kubenswrapper[2560]: I0424 22:01:31.971438 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-qbhf5_68f2b2b0-7441-4845-8db7-2d2bdb770218/network-check-target-container/0.log"