Apr 17 17:21:40.978347 ip-10-0-139-84 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 17 17:21:40.978358 ip-10-0-139-84 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 17 17:21:40.978368 ip-10-0-139-84 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 17 17:21:40.978670 ip-10-0-139-84 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 17 17:21:51.132585 ip-10-0-139-84 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 17 17:21:51.132600 ip-10-0-139-84 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 8afd030ef5a54fefa7cd23193aa39257 --
Apr 17 17:24:14.585337 ip-10-0-139-84 systemd[1]: Starting Kubernetes Kubelet...
Apr 17 17:24:14.965936 ip-10-0-139-84 kubenswrapper[2573]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 17:24:14.965936 ip-10-0-139-84 kubenswrapper[2573]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 17 17:24:14.965936 ip-10-0-139-84 kubenswrapper[2573]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 17:24:14.965936 ip-10-0-139-84 kubenswrapper[2573]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 17 17:24:14.965936 ip-10-0-139-84 kubenswrapper[2573]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 17:24:14.967315 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.967225    2573 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 17 17:24:14.970333 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.970312    2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 17:24:14.970333 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.970330    2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 17:24:14.970333 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.970337    2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 17:24:14.970333 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.970341    2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 17:24:14.970596 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.970345    2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 17:24:14.970596 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.970350    2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 17:24:14.970596 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.970353    2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 17:24:14.970596 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.970357    2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 17:24:14.970596 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.970361    2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 17:24:14.970596 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.970365    2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 17:24:14.970596 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.970368    2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 17:24:14.970596 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.970375    2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 17:24:14.970596 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.970378    2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 17:24:14.970596 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.970382    2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 17:24:14.970596 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.970386    2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 17:24:14.970596 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.970394    2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 17:24:14.970596 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.970399    2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 17:24:14.970596 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.970403    2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 17:24:14.970596 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.970407    2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 17:24:14.970596 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.970411    2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 17:24:14.970596 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.970415    2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 17:24:14.970596 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.970418    2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 17:24:14.970596 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.970422    2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 17:24:14.971202 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.970427    2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 17:24:14.971202 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.970430    2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 17:24:14.971202 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.970434    2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 17:24:14.971202 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.970438    2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 17:24:14.971202 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.970442    2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 17:24:14.971202 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.970445    2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 17:24:14.971202 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.970450    2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 17:24:14.971202 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.970454    2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 17:24:14.971202 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.970458    2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 17:24:14.971202 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.970463    2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 17:24:14.971202 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.970467    2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 17:24:14.971202 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.970471    2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 17:24:14.971202 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.970475    2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 17:24:14.971202 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.970479    2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 17:24:14.971202 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.970485    2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 17:24:14.971202 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.970489    2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 17:24:14.971202 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.970494    2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 17:24:14.971202 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.970499    2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 17:24:14.971202 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.970503    2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 17:24:14.971202 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.970508    2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 17:24:14.971736 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.970513    2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 17:24:14.971736 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.970517    2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 17:24:14.971736 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.970521    2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 17:24:14.971736 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.970525    2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 17:24:14.971736 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.970531    2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 17:24:14.971736 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.970539    2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 17:24:14.971736 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.970544    2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 17:24:14.971736 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.970549    2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 17:24:14.971736 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.970554    2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 17:24:14.971736 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.970559    2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 17:24:14.971736 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.970563    2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 17:24:14.971736 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.970567    2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 17:24:14.971736 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.970571    2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 17:24:14.971736 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.970575    2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 17:24:14.971736 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.970580    2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 17:24:14.971736 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.970584    2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 17:24:14.971736 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.970588    2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 17:24:14.971736 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.970592    2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 17:24:14.971736 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.970596    2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 17:24:14.972577 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.970600    2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 17:24:14.972577 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.970604    2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 17:24:14.972577 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.970608    2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 17:24:14.972577 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.970614    2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 17:24:14.972577 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.970618    2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 17:24:14.972577 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.970622    2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 17:24:14.972577 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.970627    2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 17:24:14.972577 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.970632    2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 17:24:14.972577 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.970636    2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 17:24:14.972577 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.970641    2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 17:24:14.972577 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.970645    2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 17:24:14.972577 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.970650    2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 17:24:14.972577 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.970655    2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 17:24:14.972577 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.970659    2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 17:24:14.972577 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.970663    2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 17:24:14.972577 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.970667    2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 17:24:14.972577 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.970671    2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 17:24:14.972577 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.970676    2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 17:24:14.972577 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.970680    2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 17:24:14.972577 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.970684    2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 17:24:14.973451 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.970688    2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 17:24:14.973451 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.970692    2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 17:24:14.973451 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.970696    2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 17:24:14.973451 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.970701    2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 17:24:14.973451 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.972357    2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 17:24:14.973451 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.972369    2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 17:24:14.973451 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.972374    2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 17:24:14.973451 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.972379    2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 17:24:14.973451 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.972384    2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 17:24:14.973451 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.972388    2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 17:24:14.973451 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.972393    2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 17:24:14.973451 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.972397    2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 17:24:14.973451 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.972403    2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 17:24:14.973451 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.972407    2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 17:24:14.973451 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.972412    2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 17:24:14.973451 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.972416    2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 17:24:14.973451 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.972420    2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 17:24:14.973451 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.972425    2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 17:24:14.973451 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.972430    2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 17:24:14.973451 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.972435    2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 17:24:14.974316 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.972440    2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 17:24:14.974316 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.972444    2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 17:24:14.974316 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.972448    2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 17:24:14.974316 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.972453    2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 17:24:14.974316 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.972468    2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 17:24:14.974316 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.972472    2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 17:24:14.974316 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.972477    2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 17:24:14.974316 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.972481    2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 17:24:14.974316 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.972485    2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 17:24:14.974316 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.972490    2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 17:24:14.974316 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.972494    2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 17:24:14.974316 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.972498    2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 17:24:14.974316 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.972502    2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 17:24:14.974316 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.972506    2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 17:24:14.974316 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.972511    2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 17:24:14.974316 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.972515    2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 17:24:14.974316 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.972519    2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 17:24:14.974316 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.972523    2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 17:24:14.974316 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.972530    2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 17:24:14.974808 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.972536    2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 17:24:14.974808 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.972541    2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 17:24:14.974808 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.972545    2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 17:24:14.974808 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.972550    2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 17:24:14.974808 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.972554    2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 17:24:14.974808 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.972559    2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 17:24:14.974808 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.972563    2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 17:24:14.974808 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.972568    2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 17:24:14.974808 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.972572    2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 17:24:14.974808 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.972578    2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 17:24:14.974808 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.972581    2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 17:24:14.974808 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.972585    2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 17:24:14.974808 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.972590    2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 17:24:14.974808 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.972595    2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 17:24:14.974808 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.972601    2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 17:24:14.974808 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.972607    2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 17:24:14.974808 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.972612    2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 17:24:14.974808 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.972616    2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 17:24:14.974808 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.972621    2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 17:24:14.975380 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.972626    2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 17:24:14.975380 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.972630    2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 17:24:14.975380 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.972634    2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 17:24:14.975380 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.972638    2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 17:24:14.975380 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.972642    2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 17:24:14.975380 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.972646    2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 17:24:14.975380 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.972650    2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 17:24:14.975380 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.972654    2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 17:24:14.975380 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.972658    2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 17:24:14.975380 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.972662    2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 17:24:14.975380 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.972667    2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 17:24:14.975380 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.972671    2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 17:24:14.975380 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.972675    2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 17:24:14.975380 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.972680    2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 17:24:14.975380 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.972684    2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 17:24:14.975380 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.972688    2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 17:24:14.975380 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.972692    2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 17:24:14.975380 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.972697    2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 17:24:14.975380 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.972701    2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 17:24:14.975380 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.972705    2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 17:24:14.976051 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.972710    2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 17:24:14.976051 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.972714    2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 17:24:14.976051 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.972718    2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 17:24:14.976051 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.972724    2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 17:24:14.976051 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.972728    2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 17:24:14.976051 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.972732    2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 17:24:14.976051 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.972737    2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 17:24:14.976051 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.972741    2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 17:24:14.976051 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.972745    2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 17:24:14.976051 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.972749    2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 17:24:14.976051 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.972753    2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 17:24:14.976051 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.972757    2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 17:24:14.976051 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.972863    2573 flags.go:64] FLAG: --address="0.0.0.0"
Apr 17 17:24:14.976051 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.972874    2573 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 17 17:24:14.976051 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.972884    2573 flags.go:64] FLAG: --anonymous-auth="true"
Apr 17 17:24:14.976051 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.972890    2573 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 17 17:24:14.976051 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.972897    2573 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 17 17:24:14.976051 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.972903    2573 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 17 17:24:14.976051 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.972910    2573 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 17 17:24:14.976051 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.972917    2573 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 17 17:24:14.976051 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.972922    2573 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 17 17:24:14.976677 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.972926    2573 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 17 17:24:14.976677 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.972932    2573 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 17 17:24:14.976677 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.972938    2573 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 17 17:24:14.976677 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.972943    2573 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 17 17:24:14.976677 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.972948    2573 flags.go:64] FLAG: --cgroup-root=""
Apr 17 17:24:14.976677 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.972953    2573 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 17 17:24:14.976677 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.972957    2573 flags.go:64] FLAG: --client-ca-file=""
Apr 17 17:24:14.976677 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.972962    2573 flags.go:64] FLAG: --cloud-config=""
Apr 17 17:24:14.976677 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.972967    2573 flags.go:64] FLAG: --cloud-provider="external"
Apr 17 17:24:14.976677 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.972972    2573 flags.go:64] FLAG: --cluster-dns="[]"
Apr 17 17:24:14.976677 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.972978    2573 flags.go:64] FLAG: --cluster-domain=""
Apr 17 17:24:14.976677 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.972983    2573 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 17 17:24:14.976677 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.972988    2573 flags.go:64] FLAG: --config-dir=""
Apr 17 17:24:14.976677 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.972993    2573 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 17 17:24:14.976677 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.972999    2573 flags.go:64] FLAG: --container-log-max-files="5"
Apr 17 17:24:14.976677 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973005    2573 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 17 17:24:14.976677 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973010    2573 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 17 17:24:14.976677 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973035    2573 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 17 17:24:14.976677 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973041    2573 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 17 17:24:14.976677 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973045    2573 flags.go:64] FLAG: --contention-profiling="false"
Apr 17 17:24:14.976677 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973050    2573 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 17 17:24:14.976677 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973055    2573 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 17 17:24:14.976677 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973060    2573 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 17 17:24:14.976677 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973065    2573 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 17 17:24:14.976677 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973073    2573 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 17 17:24:14.977424 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973078    2573 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 17 17:24:14.977424 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973083    2573 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 17 17:24:14.977424 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973088    2573 flags.go:64] FLAG: --enable-load-reader="false"
Apr 17 17:24:14.977424 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973093    2573 flags.go:64] FLAG: --enable-server="true"
Apr 17 17:24:14.977424 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973098    2573 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 17 17:24:14.977424 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973106    2573 flags.go:64] FLAG: --event-burst="100"
Apr 17 17:24:14.977424 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973111    2573 flags.go:64] FLAG: --event-qps="50"
Apr 17 17:24:14.977424 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973116    2573 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 17 17:24:14.977424 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973121    2573 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 17 17:24:14.977424 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973125    2573 flags.go:64] FLAG: --eviction-hard=""
Apr 17 17:24:14.977424 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973132    2573 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 17 17:24:14.977424 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973137    2573 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 17 17:24:14.977424 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973142    2573 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 17 17:24:14.977424 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973147    2573 flags.go:64] FLAG: --eviction-soft=""
Apr 17 17:24:14.977424 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973152    2573 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 17 17:24:14.977424 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973157    2573 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 17 17:24:14.977424 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973162    2573 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 17 17:24:14.977424 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973167    2573 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 17 17:24:14.977424 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973171    2573 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 17 17:24:14.977424 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973176    2573 flags.go:64] FLAG: --fail-swap-on="true"
Apr 17 17:24:14.977424 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973181    2573 flags.go:64] FLAG: --feature-gates=""
Apr 17 17:24:14.977424 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973188    2573 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 17 17:24:14.977424 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973193    2573 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 17 17:24:14.977424 ip-10-0-139-84 kubenswrapper[2573]:
I0417 17:24:14.973198 2573 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 17 17:24:14.977424 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973203 2573 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 17 17:24:14.978163 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973208 2573 flags.go:64] FLAG: --healthz-port="10248" Apr 17 17:24:14.978163 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973213 2573 flags.go:64] FLAG: --help="false" Apr 17 17:24:14.978163 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973218 2573 flags.go:64] FLAG: --hostname-override="ip-10-0-139-84.ec2.internal" Apr 17 17:24:14.978163 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973223 2573 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 17 17:24:14.978163 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973228 2573 flags.go:64] FLAG: --http-check-frequency="20s" Apr 17 17:24:14.978163 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973233 2573 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 17 17:24:14.978163 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973238 2573 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 17 17:24:14.978163 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973245 2573 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 17 17:24:14.978163 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973250 2573 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 17 17:24:14.978163 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973254 2573 flags.go:64] FLAG: --image-service-endpoint="" Apr 17 17:24:14.978163 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973259 2573 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 17 17:24:14.978163 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973264 2573 flags.go:64] FLAG: --kube-api-burst="100" Apr 17 17:24:14.978163 
ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973269 2573 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 17 17:24:14.978163 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973274 2573 flags.go:64] FLAG: --kube-api-qps="50" Apr 17 17:24:14.978163 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973279 2573 flags.go:64] FLAG: --kube-reserved="" Apr 17 17:24:14.978163 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973284 2573 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 17 17:24:14.978163 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973288 2573 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 17 17:24:14.978163 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973293 2573 flags.go:64] FLAG: --kubelet-cgroups="" Apr 17 17:24:14.978163 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973298 2573 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 17 17:24:14.978163 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973302 2573 flags.go:64] FLAG: --lock-file="" Apr 17 17:24:14.978163 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973307 2573 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 17 17:24:14.978163 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973312 2573 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 17 17:24:14.978163 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973317 2573 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 17 17:24:14.978163 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973326 2573 flags.go:64] FLAG: --log-json-split-stream="false" Apr 17 17:24:14.978768 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973331 2573 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 17 17:24:14.978768 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973335 2573 flags.go:64] FLAG: --log-text-split-stream="false" Apr 17 17:24:14.978768 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973340 2573 flags.go:64] FLAG: 
--logging-format="text" Apr 17 17:24:14.978768 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973346 2573 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 17 17:24:14.978768 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973352 2573 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 17 17:24:14.978768 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973357 2573 flags.go:64] FLAG: --manifest-url="" Apr 17 17:24:14.978768 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973362 2573 flags.go:64] FLAG: --manifest-url-header="" Apr 17 17:24:14.978768 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973369 2573 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 17 17:24:14.978768 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973373 2573 flags.go:64] FLAG: --max-open-files="1000000" Apr 17 17:24:14.978768 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973380 2573 flags.go:64] FLAG: --max-pods="110" Apr 17 17:24:14.978768 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973385 2573 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 17 17:24:14.978768 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973390 2573 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 17 17:24:14.978768 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973395 2573 flags.go:64] FLAG: --memory-manager-policy="None" Apr 17 17:24:14.978768 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973400 2573 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 17 17:24:14.978768 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973405 2573 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 17 17:24:14.978768 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973410 2573 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 17 17:24:14.978768 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973440 2573 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 17 
17:24:14.978768 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973463 2573 flags.go:64] FLAG: --node-status-max-images="50" Apr 17 17:24:14.978768 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973468 2573 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 17 17:24:14.978768 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973472 2573 flags.go:64] FLAG: --oom-score-adj="-999" Apr 17 17:24:14.978768 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973477 2573 flags.go:64] FLAG: --pod-cidr="" Apr 17 17:24:14.978768 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973482 2573 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 17 17:24:14.978768 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973492 2573 flags.go:64] FLAG: --pod-manifest-path="" Apr 17 17:24:14.979337 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973497 2573 flags.go:64] FLAG: --pod-max-pids="-1" Apr 17 17:24:14.979337 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973502 2573 flags.go:64] FLAG: --pods-per-core="0" Apr 17 17:24:14.979337 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973507 2573 flags.go:64] FLAG: --port="10250" Apr 17 17:24:14.979337 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973512 2573 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 17 17:24:14.979337 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973517 2573 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-024623882f890cd71" Apr 17 17:24:14.979337 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973523 2573 flags.go:64] FLAG: --qos-reserved="" Apr 17 17:24:14.979337 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973528 2573 flags.go:64] FLAG: --read-only-port="10255" Apr 17 17:24:14.979337 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973534 2573 flags.go:64] FLAG: --register-node="true" Apr 17 17:24:14.979337 ip-10-0-139-84 
kubenswrapper[2573]: I0417 17:24:14.973539 2573 flags.go:64] FLAG: --register-schedulable="true" Apr 17 17:24:14.979337 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973544 2573 flags.go:64] FLAG: --register-with-taints="" Apr 17 17:24:14.979337 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973550 2573 flags.go:64] FLAG: --registry-burst="10" Apr 17 17:24:14.979337 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973555 2573 flags.go:64] FLAG: --registry-qps="5" Apr 17 17:24:14.979337 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973559 2573 flags.go:64] FLAG: --reserved-cpus="" Apr 17 17:24:14.979337 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973564 2573 flags.go:64] FLAG: --reserved-memory="" Apr 17 17:24:14.979337 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973570 2573 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 17 17:24:14.979337 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973575 2573 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 17 17:24:14.979337 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973581 2573 flags.go:64] FLAG: --rotate-certificates="false" Apr 17 17:24:14.979337 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973585 2573 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 17 17:24:14.979337 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973590 2573 flags.go:64] FLAG: --runonce="false" Apr 17 17:24:14.979337 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973595 2573 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 17 17:24:14.979337 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973600 2573 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 17 17:24:14.979337 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973605 2573 flags.go:64] FLAG: --seccomp-default="false" Apr 17 17:24:14.979337 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973609 2573 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 17 17:24:14.979337 ip-10-0-139-84 
kubenswrapper[2573]: I0417 17:24:14.973614 2573 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 17 17:24:14.979337 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973619 2573 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 17 17:24:14.979337 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973625 2573 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 17 17:24:14.979967 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973631 2573 flags.go:64] FLAG: --storage-driver-password="root" Apr 17 17:24:14.979967 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973635 2573 flags.go:64] FLAG: --storage-driver-secure="false" Apr 17 17:24:14.979967 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973640 2573 flags.go:64] FLAG: --storage-driver-table="stats" Apr 17 17:24:14.979967 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973644 2573 flags.go:64] FLAG: --storage-driver-user="root" Apr 17 17:24:14.979967 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973649 2573 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 17 17:24:14.979967 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973654 2573 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 17 17:24:14.979967 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973659 2573 flags.go:64] FLAG: --system-cgroups="" Apr 17 17:24:14.979967 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973664 2573 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 17 17:24:14.979967 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973672 2573 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 17 17:24:14.979967 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973677 2573 flags.go:64] FLAG: --tls-cert-file="" Apr 17 17:24:14.979967 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973682 2573 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 17 17:24:14.979967 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973689 2573 flags.go:64] FLAG: 
--tls-min-version="" Apr 17 17:24:14.979967 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973694 2573 flags.go:64] FLAG: --tls-private-key-file="" Apr 17 17:24:14.979967 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973698 2573 flags.go:64] FLAG: --topology-manager-policy="none" Apr 17 17:24:14.979967 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973704 2573 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 17 17:24:14.979967 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973708 2573 flags.go:64] FLAG: --topology-manager-scope="container" Apr 17 17:24:14.979967 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973713 2573 flags.go:64] FLAG: --v="2" Apr 17 17:24:14.979967 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973720 2573 flags.go:64] FLAG: --version="false" Apr 17 17:24:14.979967 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973726 2573 flags.go:64] FLAG: --vmodule="" Apr 17 17:24:14.979967 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973733 2573 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 17 17:24:14.979967 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.973739 2573 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 17 17:24:14.979967 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.973907 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 17:24:14.979967 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.973914 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 17:24:14.979967 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.973918 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 17:24:14.980672 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.973922 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 17:24:14.980672 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.973927 2573 feature_gate.go:328] unrecognized feature gate: 
AdditionalRoutingCapabilities Apr 17 17:24:14.980672 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.973931 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 17:24:14.980672 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.973935 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 17:24:14.980672 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.973939 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 17:24:14.980672 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.973943 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 17:24:14.980672 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.973949 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 17:24:14.980672 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.973953 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 17:24:14.980672 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.973957 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 17:24:14.980672 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.973962 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 17:24:14.980672 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.973966 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 17:24:14.980672 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.973970 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 17:24:14.980672 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.973974 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 17:24:14.980672 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.973978 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 17:24:14.980672 ip-10-0-139-84 kubenswrapper[2573]: W0417 
17:24:14.973982 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 17:24:14.980672 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.973987 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 17:24:14.980672 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.973991 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 17:24:14.980672 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.973995 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 17:24:14.980672 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.973999 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 17:24:14.980672 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.974003 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 17:24:14.981182 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.974007 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 17:24:14.981182 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.974030 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 17:24:14.981182 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.974035 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 17:24:14.981182 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.974040 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 17:24:14.981182 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.974044 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 17:24:14.981182 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.974048 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 17:24:14.981182 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.974053 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 
17:24:14.981182 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.974057 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 17:24:14.981182 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.974061 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 17:24:14.981182 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.974065 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 17:24:14.981182 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.974070 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 17:24:14.981182 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.974074 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 17:24:14.981182 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.974079 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 17:24:14.981182 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.974083 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 17:24:14.981182 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.974088 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 17:24:14.981182 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.974092 2573 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 17:24:14.981182 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.974096 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 17:24:14.981182 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.974100 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 17:24:14.981182 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.974105 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 17:24:14.981182 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.974109 2573 feature_gate.go:328] unrecognized feature 
gate: ImageModeStatusReporting Apr 17 17:24:14.981681 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.974114 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 17:24:14.981681 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.974118 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 17:24:14.981681 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.974123 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 17:24:14.981681 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.974127 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 17:24:14.981681 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.974131 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 17:24:14.981681 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.974135 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 17:24:14.981681 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.974140 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 17:24:14.981681 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.974144 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 17:24:14.981681 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.974148 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 17:24:14.981681 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.974152 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 17:24:14.981681 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.974156 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 17:24:14.981681 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.974160 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 17:24:14.981681 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.974164 2573 feature_gate.go:328] 
unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 17:24:14.981681 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.974170 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 17:24:14.981681 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.974175 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 17:24:14.981681 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.974179 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 17:24:14.981681 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.974183 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 17:24:14.981681 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.974187 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 17:24:14.981681 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.974195 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 17:24:14.981681 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.974200 2573 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 17:24:14.982191 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.974204 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 17:24:14.982191 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.974209 2573 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 17:24:14.982191 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.974215 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 17 17:24:14.982191 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.974221 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 17:24:14.982191 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.974227 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 17:24:14.982191 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.974232 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 17:24:14.982191 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.974236 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 17:24:14.982191 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.974241 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 17:24:14.982191 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.974245 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 17:24:14.982191 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.974250 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 17:24:14.982191 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.974257 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 17:24:14.982191 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.974262 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 17:24:14.982191 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.974267 2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 17:24:14.982191 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.974272 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 17:24:14.982191 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.974277 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 17:24:14.982191 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.974282 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 17:24:14.982191 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.974287 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 17:24:14.982191 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.974291 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 17:24:14.982191 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.974296 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 17:24:14.982673 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.974300 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 17:24:14.982673 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.974304 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 17:24:14.982673 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.974308 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 17:24:14.982673 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.974312 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 17:24:14.982673 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.975052 2573 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 17:24:14.982673 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.981368 2573 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 17 17:24:14.982673 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.981382 2573 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 17 17:24:14.982673 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981427 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 17:24:14.982673 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981432 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 17:24:14.982673 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981435 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 17:24:14.982673 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981439 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 17:24:14.982673 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981442 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 17:24:14.982673 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981445 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 17:24:14.982673 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981449 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 17:24:14.982673 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981452 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 17:24:14.983060 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981454 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 17:24:14.983060 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981458 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 17:24:14.983060 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981461 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 17:24:14.983060 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981463 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 17:24:14.983060 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981466 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 17:24:14.983060 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981469 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 17:24:14.983060 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981472 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 17:24:14.983060 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981474 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 17:24:14.983060 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981478 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 17:24:14.983060 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981483 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 17:24:14.983060 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981487 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 17:24:14.983060 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981490 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 17:24:14.983060 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981493 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 17:24:14.983060 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981496 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 17:24:14.983060 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981499 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 17:24:14.983060 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981501 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 17:24:14.983060 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981504 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 17:24:14.983060 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981507 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 17:24:14.983060 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981509 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 17:24:14.983521 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981512 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 17:24:14.983521 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981515 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 17:24:14.983521 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981517 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 17:24:14.983521 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981520 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 17:24:14.983521 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981524 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 17:24:14.983521 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981527 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 17:24:14.983521 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981530 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 17:24:14.983521 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981533 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 17:24:14.983521 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981536 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 17:24:14.983521 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981539 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 17:24:14.983521 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981542 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 17:24:14.983521 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981544 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 17:24:14.983521 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981547 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 17:24:14.983521 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981549 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 17:24:14.983521 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981552 2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 17:24:14.983521 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981554 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 17:24:14.983521 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981557 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 17:24:14.983521 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981560 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 17:24:14.983521 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981563 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 17:24:14.983521 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981565 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 17:24:14.984116 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981568 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 17:24:14.984116 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981571 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 17:24:14.984116 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981573 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 17:24:14.984116 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981576 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 17:24:14.984116 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981579 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 17:24:14.984116 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981581 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 17:24:14.984116 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981584 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 17:24:14.984116 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981586 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 17:24:14.984116 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981589 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 17:24:14.984116 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981592 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 17:24:14.984116 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981594 2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 17:24:14.984116 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981597 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 17:24:14.984116 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981600 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 17:24:14.984116 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981602 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 17:24:14.984116 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981605 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 17:24:14.984116 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981607 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 17:24:14.984116 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981610 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 17:24:14.984116 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981614 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 17:24:14.984116 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981617 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 17:24:14.984116 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981619 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 17:24:14.984616 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981622 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 17:24:14.984616 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981624 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 17:24:14.984616 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981627 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 17:24:14.984616 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981630 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 17:24:14.984616 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981632 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 17:24:14.984616 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981635 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 17:24:14.984616 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981637 2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 17:24:14.984616 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981640 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 17:24:14.984616 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981642 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 17:24:14.984616 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981645 2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 17:24:14.984616 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981647 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 17:24:14.984616 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981650 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 17:24:14.984616 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981653 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 17:24:14.984616 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981655 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 17:24:14.984616 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981659 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 17:24:14.984616 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981663 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 17:24:14.984616 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981666 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 17:24:14.984616 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981668 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 17:24:14.984616 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981671 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 17:24:14.985080 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.981676 2573 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 17:24:14.985080 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981799 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 17:24:14.985080 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981805 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 17:24:14.985080 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981808 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 17:24:14.985080 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981811 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 17:24:14.985080 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981814 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 17:24:14.985080 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981818 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 17:24:14.985080 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981821 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 17:24:14.985080 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981823 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 17:24:14.985080 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981826 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 17:24:14.985080 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981829 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 17:24:14.985080 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981838 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 17:24:14.985080 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981841 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 17:24:14.985080 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981845 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 17:24:14.985080 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981849 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 17:24:14.985455 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981852 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 17:24:14.985455 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981855 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 17:24:14.985455 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981858 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 17:24:14.985455 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981861 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 17:24:14.985455 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981864 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 17:24:14.985455 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981866 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 17:24:14.985455 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981869 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 17:24:14.985455 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981872 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 17:24:14.985455 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981874 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 17:24:14.985455 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981877 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 17:24:14.985455 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981879 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 17:24:14.985455 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981882 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 17:24:14.985455 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981885 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 17:24:14.985455 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981887 2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 17:24:14.985455 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981890 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 17:24:14.985455 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981893 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 17:24:14.985455 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981895 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 17:24:14.985455 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981898 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 17:24:14.985455 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981901 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 17:24:14.985455 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981903 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 17:24:14.985940 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981905 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 17:24:14.985940 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981908 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 17:24:14.985940 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981911 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 17:24:14.985940 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981913 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 17:24:14.985940 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981916 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 17:24:14.985940 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981920 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 17:24:14.985940 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981924 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 17:24:14.985940 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981927 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 17:24:14.985940 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981929 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 17:24:14.985940 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981940 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 17:24:14.985940 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981943 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 17:24:14.985940 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981945 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 17:24:14.985940 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981948 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 17:24:14.985940 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981950 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 17:24:14.985940 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981953 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 17:24:14.985940 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981955 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 17:24:14.985940 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981958 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 17:24:14.985940 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981960 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 17:24:14.985940 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981963 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 17:24:14.986409 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981965 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 17:24:14.986409 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981968 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 17:24:14.986409 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981971 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 17:24:14.986409 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981973 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 17:24:14.986409 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981976 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 17:24:14.986409 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981978 2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 17:24:14.986409 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981981 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 17:24:14.986409 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981984 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 17:24:14.986409 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981986 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 17:24:14.986409 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981989 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 17:24:14.986409 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981991 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 17:24:14.986409 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981994 2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 17:24:14.986409 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981996 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 17:24:14.986409 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.981999 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 17:24:14.986409 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.982001 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 17:24:14.986409 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.982003 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 17:24:14.986409 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.982006 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 17:24:14.986409 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.982009 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 17:24:14.986409 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.982011 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 17:24:14.986409 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.982028 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 17:24:14.986409 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.982031 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 17:24:14.986899 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.982034 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 17:24:14.986899 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.982036 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 17:24:14.986899 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.982045 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 17:24:14.986899 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.982048 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 17:24:14.986899 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.982050 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 17:24:14.986899 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.982053 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 17:24:14.986899 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.982055 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 17:24:14.986899 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.982057 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 17:24:14.986899 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.982060 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 17:24:14.986899 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.982062 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 17:24:14.986899 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.982065 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 17:24:14.986899 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:14.982067 2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 17:24:14.986899 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.982072 2573 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 17:24:14.986899 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.982715 2573 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 17 17:24:14.986899 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.986081 2573 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 17 17:24:14.987291 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.986874 2573 server.go:1019] "Starting client certificate rotation"
Apr 17 17:24:14.987291 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.986991 2573 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 17:24:14.987291 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:14.987035 2573 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 17:24:15.008312 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.008293 2573 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 17:24:15.012642 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.012619 2573 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 17:24:15.026812 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.026795 2573 log.go:25] "Validated CRI v1 runtime API"
Apr 17 17:24:15.031811 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.031795 2573 log.go:25] "Validated CRI v1 image API"
Apr 17 17:24:15.033006 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.032993 2573 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 17 17:24:15.035674 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.035650 2573 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 97e6af25-83d4-4a78-bad3-043ac0b2fe40:/dev/nvme0n1p3 f54fe672-ed98-4678-98a0-2c3955e3b515:/dev/nvme0n1p4]
Apr 17 17:24:15.035750 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.035673 2573 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 17 17:24:15.037889 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.037867 2573 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 17 17:24:15.040704 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.040588 2573 manager.go:217] Machine: {Timestamp:2026-04-17 17:24:15.039552226 +0000 UTC m=+0.355634058 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3109675 MemoryCapacity:33164484608 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec28a6b4544b89d8cee7b552da971495 SystemUUID:ec28a6b4-544b-89d8-cee7-b552da971495 BootID:8afd030e-f5a5-4fef-a7cd-23193aa39257 Filesystems:[{Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582242304 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:e4:bb:a2:16:17 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:e4:bb:a2:16:17 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:de:40:ea:fe:c3:42 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164484608 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 17 17:24:15.040704 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.040693 2573 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 17 17:24:15.040862 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.040800 2573 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 17 17:24:15.043056 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.043028 2573 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 17 17:24:15.043226 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.043057 2573 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-139-84.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"con
tainer","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 17 17:24:15.043315 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.043240 2573 topology_manager.go:138] "Creating topology manager with none policy" Apr 17 17:24:15.043315 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.043254 2573 container_manager_linux.go:306] "Creating device plugin manager" Apr 17 17:24:15.043315 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.043277 2573 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 17 17:24:15.043315 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.043299 2573 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 17 17:24:15.044177 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.044165 2573 state_mem.go:36] "Initialized new in-memory state store" Apr 17 17:24:15.044302 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.044291 2573 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 17 17:24:15.046716 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.046704 2573 kubelet.go:491] "Attempting to sync node with API server" Apr 17 17:24:15.046766 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.046723 2573 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 17 17:24:15.047229 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.047219 2573 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 17 17:24:15.047276 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.047236 2573 kubelet.go:397] "Adding apiserver pod source" Apr 17 17:24:15.047276 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.047257 2573 apiserver.go:42] "Waiting for node sync before watching 
apiserver pods" Apr 17 17:24:15.048204 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.048191 2573 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 17:24:15.048281 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.048213 2573 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 17:24:15.051175 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.051157 2573 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 17 17:24:15.054744 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.054705 2573 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 17 17:24:15.056085 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.056068 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 17 17:24:15.056157 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.056089 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 17 17:24:15.056157 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.056097 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 17 17:24:15.056157 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.056102 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 17 17:24:15.056157 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.056108 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 17 17:24:15.056157 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.056119 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 17 17:24:15.056157 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.056125 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 17 
17:24:15.056157 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.056133 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 17 17:24:15.056157 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.056144 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 17 17:24:15.056157 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.056155 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 17 17:24:15.056409 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.056176 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 17 17:24:15.056409 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.056190 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 17 17:24:15.057790 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.057772 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 17 17:24:15.057863 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.057799 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 17 17:24:15.060351 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:15.060326 2573 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-139-84.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 17 17:24:15.060436 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.060419 2573 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-139-84.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 17 17:24:15.060672 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:15.060644 2573 reflector.go:200] "Failed to watch" err="failed to list 
*v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 17 17:24:15.060901 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.060887 2573 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-tgktm" Apr 17 17:24:15.061631 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.061619 2573 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 17 17:24:15.061673 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.061658 2573 server.go:1295] "Started kubelet" Apr 17 17:24:15.061786 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.061739 2573 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 17 17:24:15.061840 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.061811 2573 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 17 17:24:15.061967 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.061938 2573 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 17 17:24:15.062403 ip-10-0-139-84 systemd[1]: Started Kubernetes Kubelet. 
Apr 17 17:24:15.062970 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.062958 2573 server.go:317] "Adding debug handlers to kubelet server" Apr 17 17:24:15.063466 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.063450 2573 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 17 17:24:15.066995 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:15.065969 2573 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-139-84.ec2.internal.18a734d21d612148 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-139-84.ec2.internal,UID:ip-10-0-139-84.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-139-84.ec2.internal,},FirstTimestamp:2026-04-17 17:24:15.061631304 +0000 UTC m=+0.377713136,LastTimestamp:2026-04-17 17:24:15.061631304 +0000 UTC m=+0.377713136,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-139-84.ec2.internal,}" Apr 17 17:24:15.068142 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.068127 2573 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 17 17:24:15.068555 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.068536 2573 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 17 17:24:15.069650 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.069236 2573 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 17 17:24:15.069650 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.069239 2573 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 17 
17:24:15.069650 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.069266 2573 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 17 17:24:15.069650 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:15.069321 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-84.ec2.internal\" not found" Apr 17 17:24:15.069650 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.069425 2573 reconstruct.go:97] "Volume reconstruction finished" Apr 17 17:24:15.069650 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.069434 2573 reconciler.go:26] "Reconciler: start to sync state" Apr 17 17:24:15.069650 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.069481 2573 factory.go:55] Registering systemd factory Apr 17 17:24:15.069650 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.069501 2573 factory.go:223] Registration of the systemd container factory successfully Apr 17 17:24:15.069650 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.069530 2573 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-tgktm" Apr 17 17:24:15.070125 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.069915 2573 factory.go:153] Registering CRI-O factory Apr 17 17:24:15.070125 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.069930 2573 factory.go:223] Registration of the crio container factory successfully Apr 17 17:24:15.070125 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.069974 2573 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 17 17:24:15.070125 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.069993 2573 factory.go:103] Registering Raw factory Apr 17 17:24:15.070125 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.070003 2573 manager.go:1196] Started watching for new ooms in 
manager Apr 17 17:24:15.071151 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:15.071126 2573 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 17 17:24:15.071333 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.071309 2573 manager.go:319] Starting recovery of all containers Apr 17 17:24:15.078150 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.078131 2573 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 17:24:15.081609 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:15.081583 2573 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-139-84.ec2.internal\" not found" node="ip-10-0-139-84.ec2.internal" Apr 17 17:24:15.082740 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.082717 2573 manager.go:324] Recovery completed Apr 17 17:24:15.087721 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.087613 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 17:24:15.089913 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.089898 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-84.ec2.internal" event="NodeHasSufficientMemory" Apr 17 17:24:15.089985 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.089929 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-84.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 17:24:15.089985 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.089939 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-84.ec2.internal" event="NodeHasSufficientPID" Apr 17 17:24:15.090445 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.090430 2573 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 17 17:24:15.090512 ip-10-0-139-84 
kubenswrapper[2573]: I0417 17:24:15.090444 2573 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 17 17:24:15.090512 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.090466 2573 state_mem.go:36] "Initialized new in-memory state store" Apr 17 17:24:15.093157 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.093145 2573 policy_none.go:49] "None policy: Start" Apr 17 17:24:15.093204 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.093162 2573 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 17 17:24:15.093204 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.093172 2573 state_mem.go:35] "Initializing new in-memory state store" Apr 17 17:24:15.144416 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.130583 2573 manager.go:341] "Starting Device Plugin manager" Apr 17 17:24:15.144416 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:15.130619 2573 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 17 17:24:15.144416 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.130633 2573 server.go:85] "Starting device plugin registration server" Apr 17 17:24:15.144416 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.131007 2573 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 17 17:24:15.144416 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.131036 2573 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 17 17:24:15.144416 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.131141 2573 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 17 17:24:15.144416 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.131225 2573 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 17 17:24:15.144416 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.131236 2573 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 17 
17:24:15.144416 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:15.131559 2573 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 17 17:24:15.144416 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:15.131598 2573 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-139-84.ec2.internal\" not found" Apr 17 17:24:15.189424 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.189388 2573 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 17 17:24:15.190610 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.190595 2573 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 17 17:24:15.191200 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.191181 2573 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 17 17:24:15.191290 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.191226 2573 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 17 17:24:15.191290 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.191237 2573 kubelet.go:2451] "Starting kubelet main sync loop" Apr 17 17:24:15.191290 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:15.191277 2573 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 17 17:24:15.193779 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.193763 2573 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 17:24:15.232107 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.232042 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 17:24:15.233003 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.232989 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-84.ec2.internal" event="NodeHasSufficientMemory" Apr 17 17:24:15.233093 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.233035 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-84.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 17:24:15.233093 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.233050 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-84.ec2.internal" event="NodeHasSufficientPID" Apr 17 17:24:15.233093 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.233074 2573 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-139-84.ec2.internal" Apr 17 17:24:15.241472 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.241452 2573 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-139-84.ec2.internal" Apr 17 17:24:15.241523 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:15.241479 2573 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-139-84.ec2.internal\": node \"ip-10-0-139-84.ec2.internal\" not found" Apr 17 17:24:15.259257 
ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:15.259237 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-84.ec2.internal\" not found" Apr 17 17:24:15.291933 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.291900 2573 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-84.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-139-84.ec2.internal"] Apr 17 17:24:15.292002 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.291976 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 17:24:15.293116 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.293103 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-84.ec2.internal" event="NodeHasSufficientMemory" Apr 17 17:24:15.293179 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.293130 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-84.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 17:24:15.293179 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.293143 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-84.ec2.internal" event="NodeHasSufficientPID" Apr 17 17:24:15.295475 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.295463 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 17:24:15.295572 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.295558 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-84.ec2.internal" Apr 17 17:24:15.295613 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.295584 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 17:24:15.296235 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.296220 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-84.ec2.internal" event="NodeHasSufficientMemory" Apr 17 17:24:15.296333 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.296245 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-84.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 17:24:15.296333 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.296252 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-84.ec2.internal" event="NodeHasSufficientMemory" Apr 17 17:24:15.296333 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.296279 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-84.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 17:24:15.296333 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.296293 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-84.ec2.internal" event="NodeHasSufficientPID" Apr 17 17:24:15.296523 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.296256 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-84.ec2.internal" event="NodeHasSufficientPID" Apr 17 17:24:15.298582 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.298569 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-84.ec2.internal" Apr 17 17:24:15.298656 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.298592 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 17:24:15.299245 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.299230 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-84.ec2.internal" event="NodeHasSufficientMemory" Apr 17 17:24:15.299318 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.299257 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-84.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 17:24:15.299318 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.299275 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-84.ec2.internal" event="NodeHasSufficientPID" Apr 17 17:24:15.326613 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:15.326596 2573 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-139-84.ec2.internal\" not found" node="ip-10-0-139-84.ec2.internal" Apr 17 17:24:15.330933 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:15.330920 2573 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-139-84.ec2.internal\" not found" node="ip-10-0-139-84.ec2.internal" Apr 17 17:24:15.359437 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:15.359418 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-84.ec2.internal\" not found" Apr 17 17:24:15.370873 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.370840 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/52ccb430eec6031f54a284dd821db8ca-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-139-84.ec2.internal\" (UID: \"52ccb430eec6031f54a284dd821db8ca\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-84.ec2.internal" Apr 17 17:24:15.370960 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.370922 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/52ccb430eec6031f54a284dd821db8ca-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-139-84.ec2.internal\" (UID: \"52ccb430eec6031f54a284dd821db8ca\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-84.ec2.internal" Apr 17 17:24:15.371005 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.370971 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/243a441188fa30c2a72c720d3a8cc5f2-config\") pod \"kube-apiserver-proxy-ip-10-0-139-84.ec2.internal\" (UID: \"243a441188fa30c2a72c720d3a8cc5f2\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-139-84.ec2.internal" Apr 17 17:24:15.459550 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:15.459519 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-84.ec2.internal\" not found" Apr 17 17:24:15.471870 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.471846 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/243a441188fa30c2a72c720d3a8cc5f2-config\") pod \"kube-apiserver-proxy-ip-10-0-139-84.ec2.internal\" (UID: \"243a441188fa30c2a72c720d3a8cc5f2\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-139-84.ec2.internal" Apr 17 17:24:15.471947 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.471876 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/52ccb430eec6031f54a284dd821db8ca-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-139-84.ec2.internal\" (UID: \"52ccb430eec6031f54a284dd821db8ca\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-84.ec2.internal" Apr 17 17:24:15.471947 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.471894 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/52ccb430eec6031f54a284dd821db8ca-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-139-84.ec2.internal\" (UID: \"52ccb430eec6031f54a284dd821db8ca\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-84.ec2.internal" Apr 17 17:24:15.471947 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.471925 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/52ccb430eec6031f54a284dd821db8ca-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-139-84.ec2.internal\" (UID: \"52ccb430eec6031f54a284dd821db8ca\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-84.ec2.internal" Apr 17 17:24:15.472092 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.471947 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/243a441188fa30c2a72c720d3a8cc5f2-config\") pod \"kube-apiserver-proxy-ip-10-0-139-84.ec2.internal\" (UID: \"243a441188fa30c2a72c720d3a8cc5f2\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-139-84.ec2.internal" Apr 17 17:24:15.472092 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.471952 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/52ccb430eec6031f54a284dd821db8ca-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-139-84.ec2.internal\" (UID: \"52ccb430eec6031f54a284dd821db8ca\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-84.ec2.internal" Apr 17 17:24:15.560316 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:15.560245 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-84.ec2.internal\" not found" Apr 17 17:24:15.628747 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.628719 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-84.ec2.internal" Apr 17 17:24:15.633319 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.633304 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-84.ec2.internal" Apr 17 17:24:15.660907 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:15.660877 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-84.ec2.internal\" not found" Apr 17 17:24:15.761468 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:15.761433 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-84.ec2.internal\" not found" Apr 17 17:24:15.861934 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:15.861907 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-84.ec2.internal\" not found" Apr 17 17:24:15.962610 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:15.962577 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-84.ec2.internal\" not found" Apr 17 17:24:15.987027 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.986999 2573 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 17 17:24:15.987511 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.987157 2573 reflector.go:556] "Warning: watch ended with error" 
reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 17 17:24:15.987511 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:15.987167 2573 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 17 17:24:16.063365 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:16.063335 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-84.ec2.internal\" not found" Apr 17 17:24:16.068517 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:16.068498 2573 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 17 17:24:16.072849 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:16.072820 2573 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-16 17:19:15 +0000 UTC" deadline="2027-10-18 08:46:15.851193133 +0000 UTC" Apr 17 17:24:16.072849 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:16.072846 2573 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13167h21m59.778349907s" Apr 17 17:24:16.078374 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:16.078355 2573 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 17 17:24:16.104381 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:16.104323 2573 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-8r7bl" Apr 
17 17:24:16.119186 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:16.119164 2573 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-8r7bl" Apr 17 17:24:16.158793 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:16.158759 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52ccb430eec6031f54a284dd821db8ca.slice/crio-c4d3db7d9b3bf48b21c87b794a71acf0a593b15eda8d42c8f7279cfdcdca9841 WatchSource:0}: Error finding container c4d3db7d9b3bf48b21c87b794a71acf0a593b15eda8d42c8f7279cfdcdca9841: Status 404 returned error can't find the container with id c4d3db7d9b3bf48b21c87b794a71acf0a593b15eda8d42c8f7279cfdcdca9841 Apr 17 17:24:16.159059 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:16.159038 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod243a441188fa30c2a72c720d3a8cc5f2.slice/crio-5cb131bd3f760712653330f00bb550fb14d816d6ee108db4c02a1a17acace9a4 WatchSource:0}: Error finding container 5cb131bd3f760712653330f00bb550fb14d816d6ee108db4c02a1a17acace9a4: Status 404 returned error can't find the container with id 5cb131bd3f760712653330f00bb550fb14d816d6ee108db4c02a1a17acace9a4 Apr 17 17:24:16.162238 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:16.162226 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 17:24:16.163859 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:16.163843 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-84.ec2.internal\" not found" Apr 17 17:24:16.194513 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:16.194461 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-84.ec2.internal" 
event={"ID":"52ccb430eec6031f54a284dd821db8ca","Type":"ContainerStarted","Data":"c4d3db7d9b3bf48b21c87b794a71acf0a593b15eda8d42c8f7279cfdcdca9841"} Apr 17 17:24:16.195336 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:16.195318 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-84.ec2.internal" event={"ID":"243a441188fa30c2a72c720d3a8cc5f2","Type":"ContainerStarted","Data":"5cb131bd3f760712653330f00bb550fb14d816d6ee108db4c02a1a17acace9a4"} Apr 17 17:24:16.264535 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:16.264506 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-84.ec2.internal\" not found" Apr 17 17:24:16.364990 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:16.364901 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-84.ec2.internal\" not found" Apr 17 17:24:16.434500 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:16.434477 2573 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 17:24:16.465655 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:16.465618 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-84.ec2.internal\" not found" Apr 17 17:24:16.566427 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:16.566396 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-84.ec2.internal\" not found" Apr 17 17:24:16.646571 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:16.646496 2573 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 17:24:16.669238 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:16.669045 2573 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-84.ec2.internal" Apr 17 17:24:16.679394 ip-10-0-139-84 
kubenswrapper[2573]: I0417 17:24:16.679366 2573 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 17 17:24:16.680215 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:16.680194 2573 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-84.ec2.internal" Apr 17 17:24:16.689569 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:16.689546 2573 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 17 17:24:17.019966 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.019896 2573 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 17:24:17.049028 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.048993 2573 apiserver.go:52] "Watching apiserver" Apr 17 17:24:17.057216 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.057194 2573 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 17 17:24:17.057567 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.057544 2573 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-network-diagnostics/network-check-target-xckw7","openshift-ovn-kubernetes/ovnkube-node-z92s2","kube-system/konnectivity-agent-jwn7v","kube-system/kube-apiserver-proxy-ip-10-0-139-84.ec2.internal","openshift-image-registry/node-ca-whd7s","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-84.ec2.internal","openshift-multus/multus-x8vrb","openshift-network-operator/iptables-alerter-rfdsw","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2h2v2","openshift-cluster-node-tuning-operator/tuned-txc9f","openshift-dns/node-resolver-7l6j9","openshift-multus/multus-additional-cni-plugins-lsns4","openshift-multus/network-metrics-daemon-llf4n"] Apr 17 17:24:17.059979 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.059956 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-txc9f" Apr 17 17:24:17.062485 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.062322 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-z92s2" Apr 17 17:24:17.062485 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.062396 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-jwdsh\"" Apr 17 17:24:17.062485 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.062410 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 17 17:24:17.062485 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.062399 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 17 17:24:17.064271 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.064252 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-r5w7h\"" Apr 17 17:24:17.064512 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.064493 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-jwn7v" Apr 17 17:24:17.065155 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.064988 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 17 17:24:17.065155 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.065011 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 17 17:24:17.065155 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.064988 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 17 17:24:17.065348 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.065278 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 17 17:24:17.065348 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.065320 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 17 17:24:17.065441 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.065359 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 17 17:24:17.066611 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.066591 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-6w5bw\"" Apr 17 17:24:17.066742 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.066636 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 17 17:24:17.066813 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.066774 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-whd7s" Apr 17 17:24:17.066901 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.066880 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 17 17:24:17.069187 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.068869 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-x8vrb" Apr 17 17:24:17.069187 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.068939 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 17 17:24:17.069187 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.068946 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-6fv27\"" Apr 17 17:24:17.069456 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.069442 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 17 17:24:17.069741 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.069725 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 17 17:24:17.071324 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.071270 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 17 17:24:17.071324 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.071291 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-42rvn\"" Apr 17 17:24:17.071324 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.071321 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 17 17:24:17.071797 
ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.071781 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 17 17:24:17.071994 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.071974 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 17 17:24:17.072086 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.072058 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-rfdsw" Apr 17 17:24:17.074136 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.074114 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-sx926\"" Apr 17 17:24:17.074226 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.074177 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 17 17:24:17.074226 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.074200 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 17 17:24:17.074226 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.074213 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 17 17:24:17.074610 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.074593 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2h2v2" Apr 17 17:24:17.076872 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.076764 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 17 17:24:17.076872 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.076787 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-c4bww\"" Apr 17 17:24:17.076872 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.076798 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 17 17:24:17.076872 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.076764 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 17 17:24:17.077103 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.077035 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xckw7" Apr 17 17:24:17.077155 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:17.077103 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xckw7" podUID="58b27aad-7dee-48c9-bbb0-67164a5931c5" Apr 17 17:24:17.079361 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.079341 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-7l6j9" Apr 17 17:24:17.081487 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.081263 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/27b1f2c8-8cbd-4922-afec-eee9668dfd31-tmp\") pod \"tuned-txc9f\" (UID: \"27b1f2c8-8cbd-4922-afec-eee9668dfd31\") " pod="openshift-cluster-node-tuning-operator/tuned-txc9f" Apr 17 17:24:17.081487 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.081312 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b7181578-5b02-4803-bf4b-fbb5cec55a12-multus-cni-dir\") pod \"multus-x8vrb\" (UID: \"b7181578-5b02-4803-bf4b-fbb5cec55a12\") " pod="openshift-multus/multus-x8vrb" Apr 17 17:24:17.081487 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.081347 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e-host-cni-netd\") pod \"ovnkube-node-z92s2\" (UID: \"db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z92s2" Apr 17 17:24:17.081676 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.081653 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 17 17:24:17.081783 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.081760 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-b926f\"" Apr 17 17:24:17.081938 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.081922 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 17 17:24:17.082932 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.082909 2573 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/27b1f2c8-8cbd-4922-afec-eee9668dfd31-run\") pod \"tuned-txc9f\" (UID: \"27b1f2c8-8cbd-4922-afec-eee9668dfd31\") " pod="openshift-cluster-node-tuning-operator/tuned-txc9f" Apr 17 17:24:17.083093 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.082951 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e-host-run-netns\") pod \"ovnkube-node-z92s2\" (UID: \"db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z92s2" Apr 17 17:24:17.083093 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.083049 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/b7181578-5b02-4803-bf4b-fbb5cec55a12-hostroot\") pod \"multus-x8vrb\" (UID: \"b7181578-5b02-4803-bf4b-fbb5cec55a12\") " pod="openshift-multus/multus-x8vrb" Apr 17 17:24:17.083093 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.083086 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/b2f1cf66-fa2c-4ff9-b222-0a64aebbf351-iptables-alerter-script\") pod \"iptables-alerter-rfdsw\" (UID: \"b2f1cf66-fa2c-4ff9-b222-0a64aebbf351\") " pod="openshift-network-operator/iptables-alerter-rfdsw" Apr 17 17:24:17.083298 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.083118 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gcpsn\" (UniqueName: \"kubernetes.io/projected/b2f1cf66-fa2c-4ff9-b222-0a64aebbf351-kube-api-access-gcpsn\") pod \"iptables-alerter-rfdsw\" (UID: \"b2f1cf66-fa2c-4ff9-b222-0a64aebbf351\") " 
pod="openshift-network-operator/iptables-alerter-rfdsw" Apr 17 17:24:17.083298 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.083172 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/27b1f2c8-8cbd-4922-afec-eee9668dfd31-sys\") pod \"tuned-txc9f\" (UID: \"27b1f2c8-8cbd-4922-afec-eee9668dfd31\") " pod="openshift-cluster-node-tuning-operator/tuned-txc9f" Apr 17 17:24:17.083298 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.083202 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/5d28da5f-188e-4c8f-a4ec-2178fc14de55-agent-certs\") pod \"konnectivity-agent-jwn7v\" (UID: \"5d28da5f-188e-4c8f-a4ec-2178fc14de55\") " pod="kube-system/konnectivity-agent-jwn7v" Apr 17 17:24:17.083298 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.083247 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b7181578-5b02-4803-bf4b-fbb5cec55a12-system-cni-dir\") pod \"multus-x8vrb\" (UID: \"b7181578-5b02-4803-bf4b-fbb5cec55a12\") " pod="openshift-multus/multus-x8vrb" Apr 17 17:24:17.083298 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.083287 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b7181578-5b02-4803-bf4b-fbb5cec55a12-os-release\") pod \"multus-x8vrb\" (UID: \"b7181578-5b02-4803-bf4b-fbb5cec55a12\") " pod="openshift-multus/multus-x8vrb" Apr 17 17:24:17.083490 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.083316 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b7181578-5b02-4803-bf4b-fbb5cec55a12-multus-conf-dir\") pod \"multus-x8vrb\" (UID: 
\"b7181578-5b02-4803-bf4b-fbb5cec55a12\") " pod="openshift-multus/multus-x8vrb" Apr 17 17:24:17.083490 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.083346 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/5d28da5f-188e-4c8f-a4ec-2178fc14de55-konnectivity-ca\") pod \"konnectivity-agent-jwn7v\" (UID: \"5d28da5f-188e-4c8f-a4ec-2178fc14de55\") " pod="kube-system/konnectivity-agent-jwn7v" Apr 17 17:24:17.083490 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.083420 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e-run-openvswitch\") pod \"ovnkube-node-z92s2\" (UID: \"db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z92s2" Apr 17 17:24:17.083490 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.083449 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e-node-log\") pod \"ovnkube-node-z92s2\" (UID: \"db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z92s2" Apr 17 17:24:17.083622 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.083490 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b2f1cf66-fa2c-4ff9-b222-0a64aebbf351-host-slash\") pod \"iptables-alerter-rfdsw\" (UID: \"b2f1cf66-fa2c-4ff9-b222-0a64aebbf351\") " pod="openshift-network-operator/iptables-alerter-rfdsw" Apr 17 17:24:17.083622 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.083519 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e-host-kubelet\") pod \"ovnkube-node-z92s2\" (UID: \"db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z92s2" Apr 17 17:24:17.083622 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.083549 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/b7181578-5b02-4803-bf4b-fbb5cec55a12-multus-socket-dir-parent\") pod \"multus-x8vrb\" (UID: \"b7181578-5b02-4803-bf4b-fbb5cec55a12\") " pod="openshift-multus/multus-x8vrb" Apr 17 17:24:17.083622 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.083582 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b7181578-5b02-4803-bf4b-fbb5cec55a12-host-var-lib-kubelet\") pod \"multus-x8vrb\" (UID: \"b7181578-5b02-4803-bf4b-fbb5cec55a12\") " pod="openshift-multus/multus-x8vrb" Apr 17 17:24:17.083622 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.083611 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/27b1f2c8-8cbd-4922-afec-eee9668dfd31-etc-sysctl-conf\") pod \"tuned-txc9f\" (UID: \"27b1f2c8-8cbd-4922-afec-eee9668dfd31\") " pod="openshift-cluster-node-tuning-operator/tuned-txc9f" Apr 17 17:24:17.083799 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.083655 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/27b1f2c8-8cbd-4922-afec-eee9668dfd31-etc-tuned\") pod \"tuned-txc9f\" (UID: \"27b1f2c8-8cbd-4922-afec-eee9668dfd31\") " pod="openshift-cluster-node-tuning-operator/tuned-txc9f" Apr 17 17:24:17.083799 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.083686 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-z92s2\" (UID: \"db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z92s2" Apr 17 17:24:17.083799 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.083713 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/96e2a5d4-ba15-434e-9514-847bbd7fec29-host\") pod \"node-ca-whd7s\" (UID: \"96e2a5d4-ba15-434e-9514-847bbd7fec29\") " pod="openshift-image-registry/node-ca-whd7s" Apr 17 17:24:17.083906 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.083802 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/b7181578-5b02-4803-bf4b-fbb5cec55a12-multus-daemon-config\") pod \"multus-x8vrb\" (UID: \"b7181578-5b02-4803-bf4b-fbb5cec55a12\") " pod="openshift-multus/multus-x8vrb" Apr 17 17:24:17.083906 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.083831 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/27b1f2c8-8cbd-4922-afec-eee9668dfd31-etc-modprobe-d\") pod \"tuned-txc9f\" (UID: \"27b1f2c8-8cbd-4922-afec-eee9668dfd31\") " pod="openshift-cluster-node-tuning-operator/tuned-txc9f" Apr 17 17:24:17.083906 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.083859 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/27b1f2c8-8cbd-4922-afec-eee9668dfd31-etc-sysctl-d\") pod \"tuned-txc9f\" (UID: \"27b1f2c8-8cbd-4922-afec-eee9668dfd31\") " 
pod="openshift-cluster-node-tuning-operator/tuned-txc9f" Apr 17 17:24:17.083906 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.083889 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e-host-slash\") pod \"ovnkube-node-z92s2\" (UID: \"db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z92s2" Apr 17 17:24:17.084239 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.084154 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/27b1f2c8-8cbd-4922-afec-eee9668dfd31-host\") pod \"tuned-txc9f\" (UID: \"27b1f2c8-8cbd-4922-afec-eee9668dfd31\") " pod="openshift-cluster-node-tuning-operator/tuned-txc9f" Apr 17 17:24:17.084239 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.084186 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kq6fg\" (UniqueName: \"kubernetes.io/projected/27b1f2c8-8cbd-4922-afec-eee9668dfd31-kube-api-access-kq6fg\") pod \"tuned-txc9f\" (UID: \"27b1f2c8-8cbd-4922-afec-eee9668dfd31\") " pod="openshift-cluster-node-tuning-operator/tuned-txc9f" Apr 17 17:24:17.084373 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.084269 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2vm7\" (UniqueName: \"kubernetes.io/projected/b7181578-5b02-4803-bf4b-fbb5cec55a12-kube-api-access-f2vm7\") pod \"multus-x8vrb\" (UID: \"b7181578-5b02-4803-bf4b-fbb5cec55a12\") " pod="openshift-multus/multus-x8vrb" Apr 17 17:24:17.084373 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.084309 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/27b1f2c8-8cbd-4922-afec-eee9668dfd31-var-lib-kubelet\") pod \"tuned-txc9f\" (UID: \"27b1f2c8-8cbd-4922-afec-eee9668dfd31\") " pod="openshift-cluster-node-tuning-operator/tuned-txc9f"
Apr 17 17:24:17.084373 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.084349 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e-systemd-units\") pod \"ovnkube-node-z92s2\" (UID: \"db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z92s2"
Apr 17 17:24:17.084531 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.084415 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b7181578-5b02-4803-bf4b-fbb5cec55a12-host-run-netns\") pod \"multus-x8vrb\" (UID: \"b7181578-5b02-4803-bf4b-fbb5cec55a12\") " pod="openshift-multus/multus-x8vrb"
Apr 17 17:24:17.084531 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.084448 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b7181578-5b02-4803-bf4b-fbb5cec55a12-host-var-lib-cni-bin\") pod \"multus-x8vrb\" (UID: \"b7181578-5b02-4803-bf4b-fbb5cec55a12\") " pod="openshift-multus/multus-x8vrb"
Apr 17 17:24:17.084629 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.084514 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/27b1f2c8-8cbd-4922-afec-eee9668dfd31-etc-sysconfig\") pod \"tuned-txc9f\" (UID: \"27b1f2c8-8cbd-4922-afec-eee9668dfd31\") " pod="openshift-cluster-node-tuning-operator/tuned-txc9f"
Apr 17 17:24:17.084629 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.084560 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e-run-ovn\") pod \"ovnkube-node-z92s2\" (UID: \"db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z92s2"
Apr 17 17:24:17.084629 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.084597 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e-host-run-ovn-kubernetes\") pod \"ovnkube-node-z92s2\" (UID: \"db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z92s2"
Apr 17 17:24:17.084765 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.084632 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e-env-overrides\") pod \"ovnkube-node-z92s2\" (UID: \"db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z92s2"
Apr 17 17:24:17.084765 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.084702 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/96e2a5d4-ba15-434e-9514-847bbd7fec29-serviceca\") pod \"node-ca-whd7s\" (UID: \"96e2a5d4-ba15-434e-9514-847bbd7fec29\") " pod="openshift-image-registry/node-ca-whd7s"
Apr 17 17:24:17.084765 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.084752 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b7181578-5b02-4803-bf4b-fbb5cec55a12-cni-binary-copy\") pod \"multus-x8vrb\" (UID: \"b7181578-5b02-4803-bf4b-fbb5cec55a12\") " pod="openshift-multus/multus-x8vrb"
Apr 17 17:24:17.084890 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.084777 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/b7181578-5b02-4803-bf4b-fbb5cec55a12-host-var-lib-cni-multus\") pod \"multus-x8vrb\" (UID: \"b7181578-5b02-4803-bf4b-fbb5cec55a12\") " pod="openshift-multus/multus-x8vrb"
Apr 17 17:24:17.084890 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.084846 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e-ovnkube-config\") pod \"ovnkube-node-z92s2\" (UID: \"db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z92s2"
Apr 17 17:24:17.084976 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.084904 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w74qd\" (UniqueName: \"kubernetes.io/projected/96e2a5d4-ba15-434e-9514-847bbd7fec29-kube-api-access-w74qd\") pod \"node-ca-whd7s\" (UID: \"96e2a5d4-ba15-434e-9514-847bbd7fec29\") " pod="openshift-image-registry/node-ca-whd7s"
Apr 17 17:24:17.084976 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.084935 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b7181578-5b02-4803-bf4b-fbb5cec55a12-cnibin\") pod \"multus-x8vrb\" (UID: \"b7181578-5b02-4803-bf4b-fbb5cec55a12\") " pod="openshift-multus/multus-x8vrb"
Apr 17 17:24:17.084976 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.084965 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/b7181578-5b02-4803-bf4b-fbb5cec55a12-host-run-multus-certs\") pod \"multus-x8vrb\" (UID: \"b7181578-5b02-4803-bf4b-fbb5cec55a12\") " pod="openshift-multus/multus-x8vrb"
Apr 17 17:24:17.085236 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.084993 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/27b1f2c8-8cbd-4922-afec-eee9668dfd31-etc-systemd\") pod \"tuned-txc9f\" (UID: \"27b1f2c8-8cbd-4922-afec-eee9668dfd31\") " pod="openshift-cluster-node-tuning-operator/tuned-txc9f"
Apr 17 17:24:17.085236 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.085042 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/27b1f2c8-8cbd-4922-afec-eee9668dfd31-lib-modules\") pod \"tuned-txc9f\" (UID: \"27b1f2c8-8cbd-4922-afec-eee9668dfd31\") " pod="openshift-cluster-node-tuning-operator/tuned-txc9f"
Apr 17 17:24:17.085236 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.085066 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-llf4n"
Apr 17 17:24:17.085236 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.085075 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e-host-cni-bin\") pod \"ovnkube-node-z92s2\" (UID: \"db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z92s2"
Apr 17 17:24:17.085236 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.085155 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e-ovnkube-script-lib\") pod \"ovnkube-node-z92s2\" (UID: \"db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z92s2"
Apr 17 17:24:17.085236 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.085191 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b7181578-5b02-4803-bf4b-fbb5cec55a12-etc-kubernetes\") pod \"multus-x8vrb\" (UID: \"b7181578-5b02-4803-bf4b-fbb5cec55a12\") " pod="openshift-multus/multus-x8vrb"
Apr 17 17:24:17.085236 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:17.085197 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-llf4n" podUID="8a06ed33-b68b-4c09-88ed-8a7af0e52ef4"
Apr 17 17:24:17.085236 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.085222 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e-run-systemd\") pod \"ovnkube-node-z92s2\" (UID: \"db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z92s2"
Apr 17 17:24:17.085585 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.085250 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e-ovn-node-metrics-cert\") pod \"ovnkube-node-z92s2\" (UID: \"db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z92s2"
Apr 17 17:24:17.085585 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.085300 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/b7181578-5b02-4803-bf4b-fbb5cec55a12-host-run-k8s-cni-cncf-io\") pod \"multus-x8vrb\" (UID: \"b7181578-5b02-4803-bf4b-fbb5cec55a12\") " pod="openshift-multus/multus-x8vrb"
Apr 17 17:24:17.085585 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.085338 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/27b1f2c8-8cbd-4922-afec-eee9668dfd31-etc-kubernetes\") pod \"tuned-txc9f\" (UID: \"27b1f2c8-8cbd-4922-afec-eee9668dfd31\") " pod="openshift-cluster-node-tuning-operator/tuned-txc9f"
Apr 17 17:24:17.085585 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.085372 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e-var-lib-openvswitch\") pod \"ovnkube-node-z92s2\" (UID: \"db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z92s2"
Apr 17 17:24:17.085585 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.085406 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e-etc-openvswitch\") pod \"ovnkube-node-z92s2\" (UID: \"db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z92s2"
Apr 17 17:24:17.085585 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.085433 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e-log-socket\") pod \"ovnkube-node-z92s2\" (UID: \"db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z92s2"
Apr 17 17:24:17.085585 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.085435 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-lsns4"
Apr 17 17:24:17.085585 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.085474 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4zs2\" (UniqueName: \"kubernetes.io/projected/db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e-kube-api-access-t4zs2\") pod \"ovnkube-node-z92s2\" (UID: \"db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z92s2"
Apr 17 17:24:17.087888 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.087868 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 17 17:24:17.088670 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.088507 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-z4hj5\""
Apr 17 17:24:17.088670 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.088566 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 17 17:24:17.120087 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.120057 2573 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 17:19:16 +0000 UTC" deadline="2028-01-05 06:20:43.057278154 +0000 UTC"
Apr 17 17:24:17.120087 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.120087 2573 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15060h56m25.937194854s"
Apr 17 17:24:17.156944 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.156919 2573 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 17:24:17.170192 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.170172 2573 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 17 17:24:17.186031 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.185986 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8c1b9fc8-4b21-4201-8f92-dce101b1890a-cni-binary-copy\") pod \"multus-additional-cni-plugins-lsns4\" (UID: \"8c1b9fc8-4b21-4201-8f92-dce101b1890a\") " pod="openshift-multus/multus-additional-cni-plugins-lsns4"
Apr 17 17:24:17.186031 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.186025 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/8c1b9fc8-4b21-4201-8f92-dce101b1890a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-lsns4\" (UID: \"8c1b9fc8-4b21-4201-8f92-dce101b1890a\") " pod="openshift-multus/multus-additional-cni-plugins-lsns4"
Apr 17 17:24:17.186196 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.186060 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e-run-openvswitch\") pod \"ovnkube-node-z92s2\" (UID: \"db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z92s2"
Apr 17 17:24:17.186196 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.186083 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e-node-log\") pod \"ovnkube-node-z92s2\" (UID: \"db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z92s2"
Apr 17 17:24:17.186196 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.186099 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b2f1cf66-fa2c-4ff9-b222-0a64aebbf351-host-slash\") pod \"iptables-alerter-rfdsw\" (UID: \"b2f1cf66-fa2c-4ff9-b222-0a64aebbf351\") " pod="openshift-network-operator/iptables-alerter-rfdsw"
Apr 17 17:24:17.186196 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.186113 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e-host-kubelet\") pod \"ovnkube-node-z92s2\" (UID: \"db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z92s2"
Apr 17 17:24:17.186196 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.186134 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/b7181578-5b02-4803-bf4b-fbb5cec55a12-multus-socket-dir-parent\") pod \"multus-x8vrb\" (UID: \"b7181578-5b02-4803-bf4b-fbb5cec55a12\") " pod="openshift-multus/multus-x8vrb"
Apr 17 17:24:17.186196 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.186158 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e-run-openvswitch\") pod \"ovnkube-node-z92s2\" (UID: \"db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z92s2"
Apr 17 17:24:17.186196 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.186177 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b7181578-5b02-4803-bf4b-fbb5cec55a12-host-var-lib-kubelet\") pod \"multus-x8vrb\" (UID: \"b7181578-5b02-4803-bf4b-fbb5cec55a12\") " pod="openshift-multus/multus-x8vrb"
Apr 17 17:24:17.186196 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.186178 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b2f1cf66-fa2c-4ff9-b222-0a64aebbf351-host-slash\") pod \"iptables-alerter-rfdsw\" (UID: \"b2f1cf66-fa2c-4ff9-b222-0a64aebbf351\") " pod="openshift-network-operator/iptables-alerter-rfdsw"
Apr 17 17:24:17.186457 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.186203 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e-host-kubelet\") pod \"ovnkube-node-z92s2\" (UID: \"db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z92s2"
Apr 17 17:24:17.186457 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.186226 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b7181578-5b02-4803-bf4b-fbb5cec55a12-host-var-lib-kubelet\") pod \"multus-x8vrb\" (UID: \"b7181578-5b02-4803-bf4b-fbb5cec55a12\") " pod="openshift-multus/multus-x8vrb"
Apr 17 17:24:17.186457 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.186263 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/b7181578-5b02-4803-bf4b-fbb5cec55a12-multus-socket-dir-parent\") pod \"multus-x8vrb\" (UID: \"b7181578-5b02-4803-bf4b-fbb5cec55a12\") " pod="openshift-multus/multus-x8vrb"
Apr 17 17:24:17.186457 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.186293 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e-node-log\") pod \"ovnkube-node-z92s2\" (UID: \"db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z92s2"
Apr 17 17:24:17.186457 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.186325 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlm9t\" (UniqueName: \"kubernetes.io/projected/9b0827a8-3eb5-4863-995e-e708ba3f0756-kube-api-access-hlm9t\") pod \"node-resolver-7l6j9\" (UID: \"9b0827a8-3eb5-4863-995e-e708ba3f0756\") " pod="openshift-dns/node-resolver-7l6j9"
Apr 17 17:24:17.186457 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.186358 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/27b1f2c8-8cbd-4922-afec-eee9668dfd31-etc-sysctl-conf\") pod \"tuned-txc9f\" (UID: \"27b1f2c8-8cbd-4922-afec-eee9668dfd31\") " pod="openshift-cluster-node-tuning-operator/tuned-txc9f"
Apr 17 17:24:17.186457 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.186380 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/27b1f2c8-8cbd-4922-afec-eee9668dfd31-etc-tuned\") pod \"tuned-txc9f\" (UID: \"27b1f2c8-8cbd-4922-afec-eee9668dfd31\") " pod="openshift-cluster-node-tuning-operator/tuned-txc9f"
Apr 17 17:24:17.186457 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.186406 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-z92s2\" (UID: \"db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z92s2"
Apr 17 17:24:17.186457 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.186429 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/96e2a5d4-ba15-434e-9514-847bbd7fec29-host\") pod \"node-ca-whd7s\" (UID: \"96e2a5d4-ba15-434e-9514-847bbd7fec29\") " pod="openshift-image-registry/node-ca-whd7s"
Apr 17 17:24:17.186457 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.186450 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/b7181578-5b02-4803-bf4b-fbb5cec55a12-multus-daemon-config\") pod \"multus-x8vrb\" (UID: \"b7181578-5b02-4803-bf4b-fbb5cec55a12\") " pod="openshift-multus/multus-x8vrb"
Apr 17 17:24:17.186892 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.186479 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-z92s2\" (UID: \"db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z92s2"
Apr 17 17:24:17.186892 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.186484 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/27b1f2c8-8cbd-4922-afec-eee9668dfd31-etc-sysctl-conf\") pod \"tuned-txc9f\" (UID: \"27b1f2c8-8cbd-4922-afec-eee9668dfd31\") " pod="openshift-cluster-node-tuning-operator/tuned-txc9f"
Apr 17 17:24:17.186892 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.186485 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/27b1f2c8-8cbd-4922-afec-eee9668dfd31-etc-modprobe-d\") pod \"tuned-txc9f\" (UID: \"27b1f2c8-8cbd-4922-afec-eee9668dfd31\") " pod="openshift-cluster-node-tuning-operator/tuned-txc9f"
Apr 17 17:24:17.186892 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.186503 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/96e2a5d4-ba15-434e-9514-847bbd7fec29-host\") pod \"node-ca-whd7s\" (UID: \"96e2a5d4-ba15-434e-9514-847bbd7fec29\") " pod="openshift-image-registry/node-ca-whd7s"
Apr 17 17:24:17.186892 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.186522 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/27b1f2c8-8cbd-4922-afec-eee9668dfd31-etc-sysctl-d\") pod \"tuned-txc9f\" (UID: \"27b1f2c8-8cbd-4922-afec-eee9668dfd31\") " pod="openshift-cluster-node-tuning-operator/tuned-txc9f"
Apr 17 17:24:17.186892 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.186538 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e-host-slash\") pod \"ovnkube-node-z92s2\" (UID: \"db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z92s2"
Apr 17 17:24:17.186892 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.186560 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/16768e16-66ae-4d96-97c7-c8812143f5c5-registration-dir\") pod \"aws-ebs-csi-driver-node-2h2v2\" (UID: \"16768e16-66ae-4d96-97c7-c8812143f5c5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2h2v2"
Apr 17 17:24:17.186892 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.186584 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/27b1f2c8-8cbd-4922-afec-eee9668dfd31-host\") pod \"tuned-txc9f\" (UID: \"27b1f2c8-8cbd-4922-afec-eee9668dfd31\") " pod="openshift-cluster-node-tuning-operator/tuned-txc9f"
Apr 17 17:24:17.186892 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.186597 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/27b1f2c8-8cbd-4922-afec-eee9668dfd31-etc-modprobe-d\") pod \"tuned-txc9f\" (UID: \"27b1f2c8-8cbd-4922-afec-eee9668dfd31\") " pod="openshift-cluster-node-tuning-operator/tuned-txc9f"
Apr 17 17:24:17.186892 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.186608 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kq6fg\" (UniqueName: \"kubernetes.io/projected/27b1f2c8-8cbd-4922-afec-eee9668dfd31-kube-api-access-kq6fg\") pod \"tuned-txc9f\" (UID: \"27b1f2c8-8cbd-4922-afec-eee9668dfd31\") " pod="openshift-cluster-node-tuning-operator/tuned-txc9f"
Apr 17 17:24:17.186892 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.186619 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e-host-slash\") pod \"ovnkube-node-z92s2\" (UID: \"db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z92s2"
Apr 17 17:24:17.186892 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.186630 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f2vm7\" (UniqueName: \"kubernetes.io/projected/b7181578-5b02-4803-bf4b-fbb5cec55a12-kube-api-access-f2vm7\") pod \"multus-x8vrb\" (UID: \"b7181578-5b02-4803-bf4b-fbb5cec55a12\") " pod="openshift-multus/multus-x8vrb"
Apr 17 17:24:17.186892 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.186661 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/27b1f2c8-8cbd-4922-afec-eee9668dfd31-host\") pod \"tuned-txc9f\" (UID: \"27b1f2c8-8cbd-4922-afec-eee9668dfd31\") " pod="openshift-cluster-node-tuning-operator/tuned-txc9f"
Apr 17 17:24:17.186892 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.186663 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/27b1f2c8-8cbd-4922-afec-eee9668dfd31-etc-sysctl-d\") pod \"tuned-txc9f\" (UID: \"27b1f2c8-8cbd-4922-afec-eee9668dfd31\") " pod="openshift-cluster-node-tuning-operator/tuned-txc9f"
Apr 17 17:24:17.186892 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.186671 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/27b1f2c8-8cbd-4922-afec-eee9668dfd31-var-lib-kubelet\") pod \"tuned-txc9f\" (UID: \"27b1f2c8-8cbd-4922-afec-eee9668dfd31\") " pod="openshift-cluster-node-tuning-operator/tuned-txc9f"
Apr 17 17:24:17.186892 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.186700 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e-systemd-units\") pod \"ovnkube-node-z92s2\" (UID: \"db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z92s2"
Apr 17 17:24:17.186892 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.186723 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b7181578-5b02-4803-bf4b-fbb5cec55a12-host-run-netns\") pod \"multus-x8vrb\" (UID: \"b7181578-5b02-4803-bf4b-fbb5cec55a12\") " pod="openshift-multus/multus-x8vrb"
Apr 17 17:24:17.186892 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.186750 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b7181578-5b02-4803-bf4b-fbb5cec55a12-host-var-lib-cni-bin\") pod \"multus-x8vrb\" (UID: \"b7181578-5b02-4803-bf4b-fbb5cec55a12\") " pod="openshift-multus/multus-x8vrb"
Apr 17 17:24:17.187710 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.186752 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/27b1f2c8-8cbd-4922-afec-eee9668dfd31-var-lib-kubelet\") pod \"tuned-txc9f\" (UID: \"27b1f2c8-8cbd-4922-afec-eee9668dfd31\") " pod="openshift-cluster-node-tuning-operator/tuned-txc9f"
Apr 17 17:24:17.187710 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.186790 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e-systemd-units\") pod \"ovnkube-node-z92s2\" (UID: \"db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z92s2"
Apr 17 17:24:17.187710 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.186789 2573 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 17 17:24:17.187710 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.186796 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b7181578-5b02-4803-bf4b-fbb5cec55a12-host-var-lib-cni-bin\") pod \"multus-x8vrb\" (UID: \"b7181578-5b02-4803-bf4b-fbb5cec55a12\") " pod="openshift-multus/multus-x8vrb"
Apr 17 17:24:17.187710 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.186823 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/16768e16-66ae-4d96-97c7-c8812143f5c5-sys-fs\") pod \"aws-ebs-csi-driver-node-2h2v2\" (UID: \"16768e16-66ae-4d96-97c7-c8812143f5c5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2h2v2"
Apr 17 17:24:17.187710 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.186856 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9b0827a8-3eb5-4863-995e-e708ba3f0756-hosts-file\") pod \"node-resolver-7l6j9\" (UID: \"9b0827a8-3eb5-4863-995e-e708ba3f0756\") " pod="openshift-dns/node-resolver-7l6j9"
Apr 17 17:24:17.187710 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.186794 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b7181578-5b02-4803-bf4b-fbb5cec55a12-host-run-netns\") pod \"multus-x8vrb\" (UID: \"b7181578-5b02-4803-bf4b-fbb5cec55a12\") " pod="openshift-multus/multus-x8vrb"
Apr 17 17:24:17.187710 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.186883 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/27b1f2c8-8cbd-4922-afec-eee9668dfd31-etc-sysconfig\") pod \"tuned-txc9f\" (UID: \"27b1f2c8-8cbd-4922-afec-eee9668dfd31\") " pod="openshift-cluster-node-tuning-operator/tuned-txc9f"
Apr 17 17:24:17.187710 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.186909 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e-run-ovn\") pod \"ovnkube-node-z92s2\" (UID: \"db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z92s2"
Apr 17 17:24:17.187710 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.186931 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e-host-run-ovn-kubernetes\") pod \"ovnkube-node-z92s2\" (UID: \"db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z92s2"
Apr 17 17:24:17.187710 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.186955 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e-env-overrides\") pod \"ovnkube-node-z92s2\" (UID: \"db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z92s2"
Apr 17 17:24:17.187710 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.186978 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/27b1f2c8-8cbd-4922-afec-eee9668dfd31-etc-sysconfig\") pod \"tuned-txc9f\" (UID: \"27b1f2c8-8cbd-4922-afec-eee9668dfd31\") " pod="openshift-cluster-node-tuning-operator/tuned-txc9f"
Apr 17 17:24:17.187710 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.186982 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e-run-ovn\") pod \"ovnkube-node-z92s2\" (UID: \"db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z92s2"
Apr 17 17:24:17.187710 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.186980 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/96e2a5d4-ba15-434e-9514-847bbd7fec29-serviceca\") pod \"node-ca-whd7s\" (UID: \"96e2a5d4-ba15-434e-9514-847bbd7fec29\") " pod="openshift-image-registry/node-ca-whd7s"
Apr 17 17:24:17.187710 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.187038 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b7181578-5b02-4803-bf4b-fbb5cec55a12-cni-binary-copy\") pod \"multus-x8vrb\" (UID: \"b7181578-5b02-4803-bf4b-fbb5cec55a12\") " pod="openshift-multus/multus-x8vrb"
Apr 17 17:24:17.187710 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.187052 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e-host-run-ovn-kubernetes\") pod \"ovnkube-node-z92s2\" (UID: \"db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z92s2"
Apr 17 17:24:17.187710 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.187069 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/b7181578-5b02-4803-bf4b-fbb5cec55a12-host-var-lib-cni-multus\") pod \"multus-x8vrb\" (UID: \"b7181578-5b02-4803-bf4b-fbb5cec55a12\") " pod="openshift-multus/multus-x8vrb"
Apr 17 17:24:17.187710 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.187097 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8d6h\" (UniqueName: \"kubernetes.io/projected/8a06ed33-b68b-4c09-88ed-8a7af0e52ef4-kube-api-access-x8d6h\") pod \"network-metrics-daemon-llf4n\" (UID: \"8a06ed33-b68b-4c09-88ed-8a7af0e52ef4\") " pod="openshift-multus/network-metrics-daemon-llf4n"
Apr 17 17:24:17.188549 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.187122 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e-ovnkube-config\") pod \"ovnkube-node-z92s2\" (UID: \"db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z92s2"
Apr 17 17:24:17.188549 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.187122 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/b7181578-5b02-4803-bf4b-fbb5cec55a12-multus-daemon-config\") pod \"multus-x8vrb\" (UID: \"b7181578-5b02-4803-bf4b-fbb5cec55a12\") " pod="openshift-multus/multus-x8vrb"
Apr 17 17:24:17.188549 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.187145 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w74qd\" (UniqueName: \"kubernetes.io/projected/96e2a5d4-ba15-434e-9514-847bbd7fec29-kube-api-access-w74qd\") pod \"node-ca-whd7s\" (UID: \"96e2a5d4-ba15-434e-9514-847bbd7fec29\") " pod="openshift-image-registry/node-ca-whd7s"
Apr 17 17:24:17.188549 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.187175 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b7181578-5b02-4803-bf4b-fbb5cec55a12-cnibin\") pod \"multus-x8vrb\" (UID: \"b7181578-5b02-4803-bf4b-fbb5cec55a12\") " pod="openshift-multus/multus-x8vrb"
Apr 17 17:24:17.188549 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.187200 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/b7181578-5b02-4803-bf4b-fbb5cec55a12-host-run-multus-certs\") pod \"multus-x8vrb\" (UID: \"b7181578-5b02-4803-bf4b-fbb5cec55a12\") " pod="openshift-multus/multus-x8vrb"
Apr 17 17:24:17.188549 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.187226 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9fsl\" (UniqueName: \"kubernetes.io/projected/8c1b9fc8-4b21-4201-8f92-dce101b1890a-kube-api-access-z9fsl\") pod \"multus-additional-cni-plugins-lsns4\" (UID: \"8c1b9fc8-4b21-4201-8f92-dce101b1890a\") " pod="openshift-multus/multus-additional-cni-plugins-lsns4"
Apr 17 17:24:17.188549 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.187234 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/b7181578-5b02-4803-bf4b-fbb5cec55a12-host-var-lib-cni-multus\") pod \"multus-x8vrb\" (UID: \"b7181578-5b02-4803-bf4b-fbb5cec55a12\") " pod="openshift-multus/multus-x8vrb"
Apr 17 17:24:17.188549 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.187250 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/27b1f2c8-8cbd-4922-afec-eee9668dfd31-etc-systemd\") pod \"tuned-txc9f\" (UID: \"27b1f2c8-8cbd-4922-afec-eee9668dfd31\") " pod="openshift-cluster-node-tuning-operator/tuned-txc9f"
Apr 17 17:24:17.188549 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.187276 2573 reconciler_common.go:224] "operationExecutor.MountVolume started
for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/27b1f2c8-8cbd-4922-afec-eee9668dfd31-lib-modules\") pod \"tuned-txc9f\" (UID: \"27b1f2c8-8cbd-4922-afec-eee9668dfd31\") " pod="openshift-cluster-node-tuning-operator/tuned-txc9f" Apr 17 17:24:17.188549 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.187286 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b7181578-5b02-4803-bf4b-fbb5cec55a12-cnibin\") pod \"multus-x8vrb\" (UID: \"b7181578-5b02-4803-bf4b-fbb5cec55a12\") " pod="openshift-multus/multus-x8vrb" Apr 17 17:24:17.188549 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.187299 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e-host-cni-bin\") pod \"ovnkube-node-z92s2\" (UID: \"db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z92s2" Apr 17 17:24:17.188549 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.187328 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e-ovnkube-script-lib\") pod \"ovnkube-node-z92s2\" (UID: \"db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z92s2" Apr 17 17:24:17.188549 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.187352 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b7181578-5b02-4803-bf4b-fbb5cec55a12-etc-kubernetes\") pod \"multus-x8vrb\" (UID: \"b7181578-5b02-4803-bf4b-fbb5cec55a12\") " pod="openshift-multus/multus-x8vrb" Apr 17 17:24:17.188549 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.187380 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" 
(UniqueName: \"kubernetes.io/secret/8a06ed33-b68b-4c09-88ed-8a7af0e52ef4-metrics-certs\") pod \"network-metrics-daemon-llf4n\" (UID: \"8a06ed33-b68b-4c09-88ed-8a7af0e52ef4\") " pod="openshift-multus/network-metrics-daemon-llf4n" Apr 17 17:24:17.188549 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.187404 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e-run-systemd\") pod \"ovnkube-node-z92s2\" (UID: \"db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z92s2" Apr 17 17:24:17.188549 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.187429 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e-ovn-node-metrics-cert\") pod \"ovnkube-node-z92s2\" (UID: \"db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z92s2" Apr 17 17:24:17.188549 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.187432 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/96e2a5d4-ba15-434e-9514-847bbd7fec29-serviceca\") pod \"node-ca-whd7s\" (UID: \"96e2a5d4-ba15-434e-9514-847bbd7fec29\") " pod="openshift-image-registry/node-ca-whd7s" Apr 17 17:24:17.188549 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.187457 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/b7181578-5b02-4803-bf4b-fbb5cec55a12-host-run-k8s-cni-cncf-io\") pod \"multus-x8vrb\" (UID: \"b7181578-5b02-4803-bf4b-fbb5cec55a12\") " pod="openshift-multus/multus-x8vrb" Apr 17 17:24:17.189375 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.187473 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e-env-overrides\") pod \"ovnkube-node-z92s2\" (UID: \"db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z92s2" Apr 17 17:24:17.189375 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.187484 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9hb6\" (UniqueName: \"kubernetes.io/projected/58b27aad-7dee-48c9-bbb0-67164a5931c5-kube-api-access-f9hb6\") pod \"network-check-target-xckw7\" (UID: \"58b27aad-7dee-48c9-bbb0-67164a5931c5\") " pod="openshift-network-diagnostics/network-check-target-xckw7" Apr 17 17:24:17.189375 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.187489 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e-host-cni-bin\") pod \"ovnkube-node-z92s2\" (UID: \"db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z92s2" Apr 17 17:24:17.189375 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.187533 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8c1b9fc8-4b21-4201-8f92-dce101b1890a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-lsns4\" (UID: \"8c1b9fc8-4b21-4201-8f92-dce101b1890a\") " pod="openshift-multus/multus-additional-cni-plugins-lsns4" Apr 17 17:24:17.189375 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.187538 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b7181578-5b02-4803-bf4b-fbb5cec55a12-cni-binary-copy\") pod \"multus-x8vrb\" (UID: \"b7181578-5b02-4803-bf4b-fbb5cec55a12\") " pod="openshift-multus/multus-x8vrb" Apr 17 17:24:17.189375 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.187609 2573 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/27b1f2c8-8cbd-4922-afec-eee9668dfd31-etc-kubernetes\") pod \"tuned-txc9f\" (UID: \"27b1f2c8-8cbd-4922-afec-eee9668dfd31\") " pod="openshift-cluster-node-tuning-operator/tuned-txc9f" Apr 17 17:24:17.189375 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.187615 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/27b1f2c8-8cbd-4922-afec-eee9668dfd31-lib-modules\") pod \"tuned-txc9f\" (UID: \"27b1f2c8-8cbd-4922-afec-eee9668dfd31\") " pod="openshift-cluster-node-tuning-operator/tuned-txc9f" Apr 17 17:24:17.189375 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.187634 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e-var-lib-openvswitch\") pod \"ovnkube-node-z92s2\" (UID: \"db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z92s2" Apr 17 17:24:17.189375 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.187639 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/b7181578-5b02-4803-bf4b-fbb5cec55a12-host-run-k8s-cni-cncf-io\") pod \"multus-x8vrb\" (UID: \"b7181578-5b02-4803-bf4b-fbb5cec55a12\") " pod="openshift-multus/multus-x8vrb" Apr 17 17:24:17.189375 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.187655 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/b7181578-5b02-4803-bf4b-fbb5cec55a12-host-run-multus-certs\") pod \"multus-x8vrb\" (UID: \"b7181578-5b02-4803-bf4b-fbb5cec55a12\") " pod="openshift-multus/multus-x8vrb" Apr 17 17:24:17.189375 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.187660 2573 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e-etc-openvswitch\") pod \"ovnkube-node-z92s2\" (UID: \"db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z92s2" Apr 17 17:24:17.189375 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.187609 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b7181578-5b02-4803-bf4b-fbb5cec55a12-etc-kubernetes\") pod \"multus-x8vrb\" (UID: \"b7181578-5b02-4803-bf4b-fbb5cec55a12\") " pod="openshift-multus/multus-x8vrb" Apr 17 17:24:17.189375 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.187673 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e-run-systemd\") pod \"ovnkube-node-z92s2\" (UID: \"db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z92s2" Apr 17 17:24:17.189375 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.187687 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e-log-socket\") pod \"ovnkube-node-z92s2\" (UID: \"db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z92s2" Apr 17 17:24:17.189375 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.187730 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e-log-socket\") pod \"ovnkube-node-z92s2\" (UID: \"db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z92s2" Apr 17 17:24:17.189375 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.187772 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e-var-lib-openvswitch\") pod \"ovnkube-node-z92s2\" (UID: \"db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z92s2" Apr 17 17:24:17.189375 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.187771 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e-ovnkube-config\") pod \"ovnkube-node-z92s2\" (UID: \"db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z92s2" Apr 17 17:24:17.190248 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.187813 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/27b1f2c8-8cbd-4922-afec-eee9668dfd31-etc-systemd\") pod \"tuned-txc9f\" (UID: \"27b1f2c8-8cbd-4922-afec-eee9668dfd31\") " pod="openshift-cluster-node-tuning-operator/tuned-txc9f" Apr 17 17:24:17.190248 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.187819 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t4zs2\" (UniqueName: \"kubernetes.io/projected/db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e-kube-api-access-t4zs2\") pod \"ovnkube-node-z92s2\" (UID: \"db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z92s2" Apr 17 17:24:17.190248 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.187851 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/27b1f2c8-8cbd-4922-afec-eee9668dfd31-etc-kubernetes\") pod \"tuned-txc9f\" (UID: \"27b1f2c8-8cbd-4922-afec-eee9668dfd31\") " pod="openshift-cluster-node-tuning-operator/tuned-txc9f" Apr 17 17:24:17.190248 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.187893 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9b0827a8-3eb5-4863-995e-e708ba3f0756-tmp-dir\") pod \"node-resolver-7l6j9\" (UID: \"9b0827a8-3eb5-4863-995e-e708ba3f0756\") " pod="openshift-dns/node-resolver-7l6j9" Apr 17 17:24:17.190248 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.187917 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/8c1b9fc8-4b21-4201-8f92-dce101b1890a-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-lsns4\" (UID: \"8c1b9fc8-4b21-4201-8f92-dce101b1890a\") " pod="openshift-multus/multus-additional-cni-plugins-lsns4" Apr 17 17:24:17.190248 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.187943 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/27b1f2c8-8cbd-4922-afec-eee9668dfd31-tmp\") pod \"tuned-txc9f\" (UID: \"27b1f2c8-8cbd-4922-afec-eee9668dfd31\") " pod="openshift-cluster-node-tuning-operator/tuned-txc9f" Apr 17 17:24:17.190248 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.187973 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e-etc-openvswitch\") pod \"ovnkube-node-z92s2\" (UID: \"db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z92s2" Apr 17 17:24:17.190248 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.187969 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b7181578-5b02-4803-bf4b-fbb5cec55a12-multus-cni-dir\") pod \"multus-x8vrb\" (UID: \"b7181578-5b02-4803-bf4b-fbb5cec55a12\") " pod="openshift-multus/multus-x8vrb" Apr 17 17:24:17.190248 ip-10-0-139-84 kubenswrapper[2573]: I0417 
17:24:17.188042 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e-ovnkube-script-lib\") pod \"ovnkube-node-z92s2\" (UID: \"db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z92s2" Apr 17 17:24:17.190248 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.188049 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8c1b9fc8-4b21-4201-8f92-dce101b1890a-cnibin\") pod \"multus-additional-cni-plugins-lsns4\" (UID: \"8c1b9fc8-4b21-4201-8f92-dce101b1890a\") " pod="openshift-multus/multus-additional-cni-plugins-lsns4" Apr 17 17:24:17.190248 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.188092 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8c1b9fc8-4b21-4201-8f92-dce101b1890a-os-release\") pod \"multus-additional-cni-plugins-lsns4\" (UID: \"8c1b9fc8-4b21-4201-8f92-dce101b1890a\") " pod="openshift-multus/multus-additional-cni-plugins-lsns4" Apr 17 17:24:17.190248 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.188129 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e-host-cni-netd\") pod \"ovnkube-node-z92s2\" (UID: \"db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z92s2" Apr 17 17:24:17.190248 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.188141 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b7181578-5b02-4803-bf4b-fbb5cec55a12-multus-cni-dir\") pod \"multus-x8vrb\" (UID: \"b7181578-5b02-4803-bf4b-fbb5cec55a12\") " pod="openshift-multus/multus-x8vrb" Apr 17 
17:24:17.190248 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.188149 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/16768e16-66ae-4d96-97c7-c8812143f5c5-device-dir\") pod \"aws-ebs-csi-driver-node-2h2v2\" (UID: \"16768e16-66ae-4d96-97c7-c8812143f5c5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2h2v2" Apr 17 17:24:17.190248 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.188160 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e-host-cni-netd\") pod \"ovnkube-node-z92s2\" (UID: \"db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z92s2" Apr 17 17:24:17.190248 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.188184 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/27b1f2c8-8cbd-4922-afec-eee9668dfd31-run\") pod \"tuned-txc9f\" (UID: \"27b1f2c8-8cbd-4922-afec-eee9668dfd31\") " pod="openshift-cluster-node-tuning-operator/tuned-txc9f" Apr 17 17:24:17.190248 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.188213 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e-host-run-netns\") pod \"ovnkube-node-z92s2\" (UID: \"db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z92s2" Apr 17 17:24:17.191072 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.188222 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/27b1f2c8-8cbd-4922-afec-eee9668dfd31-run\") pod \"tuned-txc9f\" (UID: \"27b1f2c8-8cbd-4922-afec-eee9668dfd31\") " pod="openshift-cluster-node-tuning-operator/tuned-txc9f" Apr 17 
17:24:17.191072 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.188237 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/b7181578-5b02-4803-bf4b-fbb5cec55a12-hostroot\") pod \"multus-x8vrb\" (UID: \"b7181578-5b02-4803-bf4b-fbb5cec55a12\") " pod="openshift-multus/multus-x8vrb" Apr 17 17:24:17.191072 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.188262 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e-host-run-netns\") pod \"ovnkube-node-z92s2\" (UID: \"db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z92s2" Apr 17 17:24:17.191072 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.188263 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/b2f1cf66-fa2c-4ff9-b222-0a64aebbf351-iptables-alerter-script\") pod \"iptables-alerter-rfdsw\" (UID: \"b2f1cf66-fa2c-4ff9-b222-0a64aebbf351\") " pod="openshift-network-operator/iptables-alerter-rfdsw" Apr 17 17:24:17.191072 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.188388 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/b7181578-5b02-4803-bf4b-fbb5cec55a12-hostroot\") pod \"multus-x8vrb\" (UID: \"b7181578-5b02-4803-bf4b-fbb5cec55a12\") " pod="openshift-multus/multus-x8vrb" Apr 17 17:24:17.191072 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.188425 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gcpsn\" (UniqueName: \"kubernetes.io/projected/b2f1cf66-fa2c-4ff9-b222-0a64aebbf351-kube-api-access-gcpsn\") pod \"iptables-alerter-rfdsw\" (UID: \"b2f1cf66-fa2c-4ff9-b222-0a64aebbf351\") " pod="openshift-network-operator/iptables-alerter-rfdsw" Apr 17 17:24:17.191072 
ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.188482 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/27b1f2c8-8cbd-4922-afec-eee9668dfd31-sys\") pod \"tuned-txc9f\" (UID: \"27b1f2c8-8cbd-4922-afec-eee9668dfd31\") " pod="openshift-cluster-node-tuning-operator/tuned-txc9f" Apr 17 17:24:17.191072 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.188514 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/5d28da5f-188e-4c8f-a4ec-2178fc14de55-agent-certs\") pod \"konnectivity-agent-jwn7v\" (UID: \"5d28da5f-188e-4c8f-a4ec-2178fc14de55\") " pod="kube-system/konnectivity-agent-jwn7v" Apr 17 17:24:17.191072 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.188545 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b7181578-5b02-4803-bf4b-fbb5cec55a12-system-cni-dir\") pod \"multus-x8vrb\" (UID: \"b7181578-5b02-4803-bf4b-fbb5cec55a12\") " pod="openshift-multus/multus-x8vrb" Apr 17 17:24:17.191072 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.188574 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b7181578-5b02-4803-bf4b-fbb5cec55a12-os-release\") pod \"multus-x8vrb\" (UID: \"b7181578-5b02-4803-bf4b-fbb5cec55a12\") " pod="openshift-multus/multus-x8vrb" Apr 17 17:24:17.191072 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.188604 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b7181578-5b02-4803-bf4b-fbb5cec55a12-multus-conf-dir\") pod \"multus-x8vrb\" (UID: \"b7181578-5b02-4803-bf4b-fbb5cec55a12\") " pod="openshift-multus/multus-x8vrb" Apr 17 17:24:17.191072 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.188631 2573 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/5d28da5f-188e-4c8f-a4ec-2178fc14de55-konnectivity-ca\") pod \"konnectivity-agent-jwn7v\" (UID: \"5d28da5f-188e-4c8f-a4ec-2178fc14de55\") " pod="kube-system/konnectivity-agent-jwn7v" Apr 17 17:24:17.191072 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.188716 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b7181578-5b02-4803-bf4b-fbb5cec55a12-system-cni-dir\") pod \"multus-x8vrb\" (UID: \"b7181578-5b02-4803-bf4b-fbb5cec55a12\") " pod="openshift-multus/multus-x8vrb" Apr 17 17:24:17.191072 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.188738 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/16768e16-66ae-4d96-97c7-c8812143f5c5-kubelet-dir\") pod \"aws-ebs-csi-driver-node-2h2v2\" (UID: \"16768e16-66ae-4d96-97c7-c8812143f5c5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2h2v2" Apr 17 17:24:17.191072 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.188774 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/27b1f2c8-8cbd-4922-afec-eee9668dfd31-sys\") pod \"tuned-txc9f\" (UID: \"27b1f2c8-8cbd-4922-afec-eee9668dfd31\") " pod="openshift-cluster-node-tuning-operator/tuned-txc9f" Apr 17 17:24:17.191072 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.188789 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/16768e16-66ae-4d96-97c7-c8812143f5c5-socket-dir\") pod \"aws-ebs-csi-driver-node-2h2v2\" (UID: \"16768e16-66ae-4d96-97c7-c8812143f5c5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2h2v2" Apr 17 17:24:17.191072 ip-10-0-139-84 
kubenswrapper[2573]: I0417 17:24:17.188742 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/b2f1cf66-fa2c-4ff9-b222-0a64aebbf351-iptables-alerter-script\") pod \"iptables-alerter-rfdsw\" (UID: \"b2f1cf66-fa2c-4ff9-b222-0a64aebbf351\") " pod="openshift-network-operator/iptables-alerter-rfdsw" Apr 17 17:24:17.191072 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.188911 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b7181578-5b02-4803-bf4b-fbb5cec55a12-os-release\") pod \"multus-x8vrb\" (UID: \"b7181578-5b02-4803-bf4b-fbb5cec55a12\") " pod="openshift-multus/multus-x8vrb" Apr 17 17:24:17.191606 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.188955 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b7181578-5b02-4803-bf4b-fbb5cec55a12-multus-conf-dir\") pod \"multus-x8vrb\" (UID: \"b7181578-5b02-4803-bf4b-fbb5cec55a12\") " pod="openshift-multus/multus-x8vrb" Apr 17 17:24:17.191606 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.188992 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/16768e16-66ae-4d96-97c7-c8812143f5c5-etc-selinux\") pod \"aws-ebs-csi-driver-node-2h2v2\" (UID: \"16768e16-66ae-4d96-97c7-c8812143f5c5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2h2v2" Apr 17 17:24:17.191606 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.189055 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znrdv\" (UniqueName: \"kubernetes.io/projected/16768e16-66ae-4d96-97c7-c8812143f5c5-kube-api-access-znrdv\") pod \"aws-ebs-csi-driver-node-2h2v2\" (UID: \"16768e16-66ae-4d96-97c7-c8812143f5c5\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2h2v2" Apr 17 17:24:17.191606 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.189148 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8c1b9fc8-4b21-4201-8f92-dce101b1890a-system-cni-dir\") pod \"multus-additional-cni-plugins-lsns4\" (UID: \"8c1b9fc8-4b21-4201-8f92-dce101b1890a\") " pod="openshift-multus/multus-additional-cni-plugins-lsns4" Apr 17 17:24:17.191606 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.189298 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/5d28da5f-188e-4c8f-a4ec-2178fc14de55-konnectivity-ca\") pod \"konnectivity-agent-jwn7v\" (UID: \"5d28da5f-188e-4c8f-a4ec-2178fc14de55\") " pod="kube-system/konnectivity-agent-jwn7v" Apr 17 17:24:17.191606 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.190166 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/27b1f2c8-8cbd-4922-afec-eee9668dfd31-etc-tuned\") pod \"tuned-txc9f\" (UID: \"27b1f2c8-8cbd-4922-afec-eee9668dfd31\") " pod="openshift-cluster-node-tuning-operator/tuned-txc9f" Apr 17 17:24:17.191606 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.190344 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/27b1f2c8-8cbd-4922-afec-eee9668dfd31-tmp\") pod \"tuned-txc9f\" (UID: \"27b1f2c8-8cbd-4922-afec-eee9668dfd31\") " pod="openshift-cluster-node-tuning-operator/tuned-txc9f" Apr 17 17:24:17.191606 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.190455 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e-ovn-node-metrics-cert\") pod \"ovnkube-node-z92s2\" (UID: 
\"db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z92s2" Apr 17 17:24:17.191606 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.191261 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/5d28da5f-188e-4c8f-a4ec-2178fc14de55-agent-certs\") pod \"konnectivity-agent-jwn7v\" (UID: \"5d28da5f-188e-4c8f-a4ec-2178fc14de55\") " pod="kube-system/konnectivity-agent-jwn7v" Apr 17 17:24:17.194851 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.194826 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2vm7\" (UniqueName: \"kubernetes.io/projected/b7181578-5b02-4803-bf4b-fbb5cec55a12-kube-api-access-f2vm7\") pod \"multus-x8vrb\" (UID: \"b7181578-5b02-4803-bf4b-fbb5cec55a12\") " pod="openshift-multus/multus-x8vrb" Apr 17 17:24:17.195208 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.195187 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w74qd\" (UniqueName: \"kubernetes.io/projected/96e2a5d4-ba15-434e-9514-847bbd7fec29-kube-api-access-w74qd\") pod \"node-ca-whd7s\" (UID: \"96e2a5d4-ba15-434e-9514-847bbd7fec29\") " pod="openshift-image-registry/node-ca-whd7s" Apr 17 17:24:17.195273 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.195241 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kq6fg\" (UniqueName: \"kubernetes.io/projected/27b1f2c8-8cbd-4922-afec-eee9668dfd31-kube-api-access-kq6fg\") pod \"tuned-txc9f\" (UID: \"27b1f2c8-8cbd-4922-afec-eee9668dfd31\") " pod="openshift-cluster-node-tuning-operator/tuned-txc9f" Apr 17 17:24:17.196378 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.196339 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gcpsn\" (UniqueName: \"kubernetes.io/projected/b2f1cf66-fa2c-4ff9-b222-0a64aebbf351-kube-api-access-gcpsn\") pod \"iptables-alerter-rfdsw\" (UID: 
\"b2f1cf66-fa2c-4ff9-b222-0a64aebbf351\") " pod="openshift-network-operator/iptables-alerter-rfdsw" Apr 17 17:24:17.197931 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.197911 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4zs2\" (UniqueName: \"kubernetes.io/projected/db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e-kube-api-access-t4zs2\") pod \"ovnkube-node-z92s2\" (UID: \"db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z92s2" Apr 17 17:24:17.289978 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.289887 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/16768e16-66ae-4d96-97c7-c8812143f5c5-registration-dir\") pod \"aws-ebs-csi-driver-node-2h2v2\" (UID: \"16768e16-66ae-4d96-97c7-c8812143f5c5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2h2v2" Apr 17 17:24:17.289978 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.289935 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/16768e16-66ae-4d96-97c7-c8812143f5c5-sys-fs\") pod \"aws-ebs-csi-driver-node-2h2v2\" (UID: \"16768e16-66ae-4d96-97c7-c8812143f5c5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2h2v2" Apr 17 17:24:17.289978 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.289955 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9b0827a8-3eb5-4863-995e-e708ba3f0756-hosts-file\") pod \"node-resolver-7l6j9\" (UID: \"9b0827a8-3eb5-4863-995e-e708ba3f0756\") " pod="openshift-dns/node-resolver-7l6j9" Apr 17 17:24:17.290245 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.289980 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x8d6h\" (UniqueName: 
\"kubernetes.io/projected/8a06ed33-b68b-4c09-88ed-8a7af0e52ef4-kube-api-access-x8d6h\") pod \"network-metrics-daemon-llf4n\" (UID: \"8a06ed33-b68b-4c09-88ed-8a7af0e52ef4\") " pod="openshift-multus/network-metrics-daemon-llf4n" Apr 17 17:24:17.290245 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.290007 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z9fsl\" (UniqueName: \"kubernetes.io/projected/8c1b9fc8-4b21-4201-8f92-dce101b1890a-kube-api-access-z9fsl\") pod \"multus-additional-cni-plugins-lsns4\" (UID: \"8c1b9fc8-4b21-4201-8f92-dce101b1890a\") " pod="openshift-multus/multus-additional-cni-plugins-lsns4" Apr 17 17:24:17.290245 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.290046 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a06ed33-b68b-4c09-88ed-8a7af0e52ef4-metrics-certs\") pod \"network-metrics-daemon-llf4n\" (UID: \"8a06ed33-b68b-4c09-88ed-8a7af0e52ef4\") " pod="openshift-multus/network-metrics-daemon-llf4n" Apr 17 17:24:17.290245 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.290064 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f9hb6\" (UniqueName: \"kubernetes.io/projected/58b27aad-7dee-48c9-bbb0-67164a5931c5-kube-api-access-f9hb6\") pod \"network-check-target-xckw7\" (UID: \"58b27aad-7dee-48c9-bbb0-67164a5931c5\") " pod="openshift-network-diagnostics/network-check-target-xckw7" Apr 17 17:24:17.290245 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.290091 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8c1b9fc8-4b21-4201-8f92-dce101b1890a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-lsns4\" (UID: \"8c1b9fc8-4b21-4201-8f92-dce101b1890a\") " pod="openshift-multus/multus-additional-cni-plugins-lsns4" Apr 17 17:24:17.290245 ip-10-0-139-84 
kubenswrapper[2573]: I0417 17:24:17.290102 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/16768e16-66ae-4d96-97c7-c8812143f5c5-sys-fs\") pod \"aws-ebs-csi-driver-node-2h2v2\" (UID: \"16768e16-66ae-4d96-97c7-c8812143f5c5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2h2v2" Apr 17 17:24:17.290245 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.290116 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9b0827a8-3eb5-4863-995e-e708ba3f0756-tmp-dir\") pod \"node-resolver-7l6j9\" (UID: \"9b0827a8-3eb5-4863-995e-e708ba3f0756\") " pod="openshift-dns/node-resolver-7l6j9" Apr 17 17:24:17.290245 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.290133 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/8c1b9fc8-4b21-4201-8f92-dce101b1890a-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-lsns4\" (UID: \"8c1b9fc8-4b21-4201-8f92-dce101b1890a\") " pod="openshift-multus/multus-additional-cni-plugins-lsns4" Apr 17 17:24:17.290245 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.290150 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8c1b9fc8-4b21-4201-8f92-dce101b1890a-cnibin\") pod \"multus-additional-cni-plugins-lsns4\" (UID: \"8c1b9fc8-4b21-4201-8f92-dce101b1890a\") " pod="openshift-multus/multus-additional-cni-plugins-lsns4" Apr 17 17:24:17.290245 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.290166 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8c1b9fc8-4b21-4201-8f92-dce101b1890a-os-release\") pod \"multus-additional-cni-plugins-lsns4\" (UID: \"8c1b9fc8-4b21-4201-8f92-dce101b1890a\") " 
pod="openshift-multus/multus-additional-cni-plugins-lsns4" Apr 17 17:24:17.290245 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.290191 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/16768e16-66ae-4d96-97c7-c8812143f5c5-device-dir\") pod \"aws-ebs-csi-driver-node-2h2v2\" (UID: \"16768e16-66ae-4d96-97c7-c8812143f5c5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2h2v2" Apr 17 17:24:17.290245 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.290215 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/16768e16-66ae-4d96-97c7-c8812143f5c5-kubelet-dir\") pod \"aws-ebs-csi-driver-node-2h2v2\" (UID: \"16768e16-66ae-4d96-97c7-c8812143f5c5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2h2v2" Apr 17 17:24:17.290245 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.290243 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/16768e16-66ae-4d96-97c7-c8812143f5c5-socket-dir\") pod \"aws-ebs-csi-driver-node-2h2v2\" (UID: \"16768e16-66ae-4d96-97c7-c8812143f5c5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2h2v2" Apr 17 17:24:17.290822 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.290272 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/16768e16-66ae-4d96-97c7-c8812143f5c5-etc-selinux\") pod \"aws-ebs-csi-driver-node-2h2v2\" (UID: \"16768e16-66ae-4d96-97c7-c8812143f5c5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2h2v2" Apr 17 17:24:17.290822 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.290296 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-znrdv\" (UniqueName: 
\"kubernetes.io/projected/16768e16-66ae-4d96-97c7-c8812143f5c5-kube-api-access-znrdv\") pod \"aws-ebs-csi-driver-node-2h2v2\" (UID: \"16768e16-66ae-4d96-97c7-c8812143f5c5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2h2v2" Apr 17 17:24:17.290822 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.290317 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8c1b9fc8-4b21-4201-8f92-dce101b1890a-system-cni-dir\") pod \"multus-additional-cni-plugins-lsns4\" (UID: \"8c1b9fc8-4b21-4201-8f92-dce101b1890a\") " pod="openshift-multus/multus-additional-cni-plugins-lsns4" Apr 17 17:24:17.290822 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.290334 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8c1b9fc8-4b21-4201-8f92-dce101b1890a-cni-binary-copy\") pod \"multus-additional-cni-plugins-lsns4\" (UID: \"8c1b9fc8-4b21-4201-8f92-dce101b1890a\") " pod="openshift-multus/multus-additional-cni-plugins-lsns4" Apr 17 17:24:17.290822 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.290348 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/8c1b9fc8-4b21-4201-8f92-dce101b1890a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-lsns4\" (UID: \"8c1b9fc8-4b21-4201-8f92-dce101b1890a\") " pod="openshift-multus/multus-additional-cni-plugins-lsns4" Apr 17 17:24:17.290822 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.290374 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hlm9t\" (UniqueName: \"kubernetes.io/projected/9b0827a8-3eb5-4863-995e-e708ba3f0756-kube-api-access-hlm9t\") pod \"node-resolver-7l6j9\" (UID: \"9b0827a8-3eb5-4863-995e-e708ba3f0756\") " pod="openshift-dns/node-resolver-7l6j9" Apr 17 17:24:17.290822 ip-10-0-139-84 
kubenswrapper[2573]: I0417 17:24:17.290440 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9b0827a8-3eb5-4863-995e-e708ba3f0756-hosts-file\") pod \"node-resolver-7l6j9\" (UID: \"9b0827a8-3eb5-4863-995e-e708ba3f0756\") " pod="openshift-dns/node-resolver-7l6j9" Apr 17 17:24:17.290822 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:17.290527 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:24:17.290822 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.290538 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/16768e16-66ae-4d96-97c7-c8812143f5c5-socket-dir\") pod \"aws-ebs-csi-driver-node-2h2v2\" (UID: \"16768e16-66ae-4d96-97c7-c8812143f5c5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2h2v2" Apr 17 17:24:17.290822 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:17.290611 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a06ed33-b68b-4c09-88ed-8a7af0e52ef4-metrics-certs podName:8a06ed33-b68b-4c09-88ed-8a7af0e52ef4 nodeName:}" failed. No retries permitted until 2026-04-17 17:24:17.790578949 +0000 UTC m=+3.106660792 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8a06ed33-b68b-4c09-88ed-8a7af0e52ef4-metrics-certs") pod "network-metrics-daemon-llf4n" (UID: "8a06ed33-b68b-4c09-88ed-8a7af0e52ef4") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:24:17.290822 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.290650 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/16768e16-66ae-4d96-97c7-c8812143f5c5-etc-selinux\") pod \"aws-ebs-csi-driver-node-2h2v2\" (UID: \"16768e16-66ae-4d96-97c7-c8812143f5c5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2h2v2" Apr 17 17:24:17.290822 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.290686 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/16768e16-66ae-4d96-97c7-c8812143f5c5-kubelet-dir\") pod \"aws-ebs-csi-driver-node-2h2v2\" (UID: \"16768e16-66ae-4d96-97c7-c8812143f5c5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2h2v2" Apr 17 17:24:17.290822 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.290167 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/16768e16-66ae-4d96-97c7-c8812143f5c5-registration-dir\") pod \"aws-ebs-csi-driver-node-2h2v2\" (UID: \"16768e16-66ae-4d96-97c7-c8812143f5c5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2h2v2" Apr 17 17:24:17.291368 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.290836 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8c1b9fc8-4b21-4201-8f92-dce101b1890a-os-release\") pod \"multus-additional-cni-plugins-lsns4\" (UID: \"8c1b9fc8-4b21-4201-8f92-dce101b1890a\") " pod="openshift-multus/multus-additional-cni-plugins-lsns4" Apr 17 17:24:17.291368 ip-10-0-139-84 
kubenswrapper[2573]: I0417 17:24:17.290896 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8c1b9fc8-4b21-4201-8f92-dce101b1890a-cnibin\") pod \"multus-additional-cni-plugins-lsns4\" (UID: \"8c1b9fc8-4b21-4201-8f92-dce101b1890a\") " pod="openshift-multus/multus-additional-cni-plugins-lsns4" Apr 17 17:24:17.291368 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.290947 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8c1b9fc8-4b21-4201-8f92-dce101b1890a-system-cni-dir\") pod \"multus-additional-cni-plugins-lsns4\" (UID: \"8c1b9fc8-4b21-4201-8f92-dce101b1890a\") " pod="openshift-multus/multus-additional-cni-plugins-lsns4" Apr 17 17:24:17.291368 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.291063 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8c1b9fc8-4b21-4201-8f92-dce101b1890a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-lsns4\" (UID: \"8c1b9fc8-4b21-4201-8f92-dce101b1890a\") " pod="openshift-multus/multus-additional-cni-plugins-lsns4" Apr 17 17:24:17.291368 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.291268 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9b0827a8-3eb5-4863-995e-e708ba3f0756-tmp-dir\") pod \"node-resolver-7l6j9\" (UID: \"9b0827a8-3eb5-4863-995e-e708ba3f0756\") " pod="openshift-dns/node-resolver-7l6j9" Apr 17 17:24:17.291368 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.291274 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8c1b9fc8-4b21-4201-8f92-dce101b1890a-cni-binary-copy\") pod \"multus-additional-cni-plugins-lsns4\" (UID: \"8c1b9fc8-4b21-4201-8f92-dce101b1890a\") " 
pod="openshift-multus/multus-additional-cni-plugins-lsns4" Apr 17 17:24:17.291368 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.291345 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/16768e16-66ae-4d96-97c7-c8812143f5c5-device-dir\") pod \"aws-ebs-csi-driver-node-2h2v2\" (UID: \"16768e16-66ae-4d96-97c7-c8812143f5c5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2h2v2" Apr 17 17:24:17.291368 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.291368 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/8c1b9fc8-4b21-4201-8f92-dce101b1890a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-lsns4\" (UID: \"8c1b9fc8-4b21-4201-8f92-dce101b1890a\") " pod="openshift-multus/multus-additional-cni-plugins-lsns4" Apr 17 17:24:17.291724 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.291699 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/8c1b9fc8-4b21-4201-8f92-dce101b1890a-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-lsns4\" (UID: \"8c1b9fc8-4b21-4201-8f92-dce101b1890a\") " pod="openshift-multus/multus-additional-cni-plugins-lsns4" Apr 17 17:24:17.297784 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:17.297762 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 17:24:17.297784 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:17.297784 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 17:24:17.297968 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:17.297796 2573 projected.go:194] Error preparing data for 
projected volume kube-api-access-f9hb6 for pod openshift-network-diagnostics/network-check-target-xckw7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:24:17.297968 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:17.297863 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/58b27aad-7dee-48c9-bbb0-67164a5931c5-kube-api-access-f9hb6 podName:58b27aad-7dee-48c9-bbb0-67164a5931c5 nodeName:}" failed. No retries permitted until 2026-04-17 17:24:17.797846822 +0000 UTC m=+3.113928660 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-f9hb6" (UniqueName: "kubernetes.io/projected/58b27aad-7dee-48c9-bbb0-67164a5931c5-kube-api-access-f9hb6") pod "network-check-target-xckw7" (UID: "58b27aad-7dee-48c9-bbb0-67164a5931c5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:24:17.299922 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.299875 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9fsl\" (UniqueName: \"kubernetes.io/projected/8c1b9fc8-4b21-4201-8f92-dce101b1890a-kube-api-access-z9fsl\") pod \"multus-additional-cni-plugins-lsns4\" (UID: \"8c1b9fc8-4b21-4201-8f92-dce101b1890a\") " pod="openshift-multus/multus-additional-cni-plugins-lsns4" Apr 17 17:24:17.300046 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.300001 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8d6h\" (UniqueName: \"kubernetes.io/projected/8a06ed33-b68b-4c09-88ed-8a7af0e52ef4-kube-api-access-x8d6h\") pod \"network-metrics-daemon-llf4n\" (UID: \"8a06ed33-b68b-4c09-88ed-8a7af0e52ef4\") " pod="openshift-multus/network-metrics-daemon-llf4n" Apr 17 17:24:17.300138 ip-10-0-139-84 kubenswrapper[2573]: I0417 
17:24:17.300116 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-znrdv\" (UniqueName: \"kubernetes.io/projected/16768e16-66ae-4d96-97c7-c8812143f5c5-kube-api-access-znrdv\") pod \"aws-ebs-csi-driver-node-2h2v2\" (UID: \"16768e16-66ae-4d96-97c7-c8812143f5c5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2h2v2" Apr 17 17:24:17.300196 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.300153 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlm9t\" (UniqueName: \"kubernetes.io/projected/9b0827a8-3eb5-4863-995e-e708ba3f0756-kube-api-access-hlm9t\") pod \"node-resolver-7l6j9\" (UID: \"9b0827a8-3eb5-4863-995e-e708ba3f0756\") " pod="openshift-dns/node-resolver-7l6j9" Apr 17 17:24:17.371166 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.371131 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-txc9f" Apr 17 17:24:17.378959 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.378943 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-z92s2" Apr 17 17:24:17.389527 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.389505 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-jwn7v" Apr 17 17:24:17.394644 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.394619 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-whd7s" Apr 17 17:24:17.402176 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.402159 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-x8vrb" Apr 17 17:24:17.407681 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.407660 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-rfdsw" Apr 17 17:24:17.413356 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.413334 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2h2v2" Apr 17 17:24:17.419867 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.419850 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-7l6j9" Apr 17 17:24:17.425419 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.425399 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-lsns4" Apr 17 17:24:17.749496 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:17.749471 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16768e16_66ae_4d96_97c7_c8812143f5c5.slice/crio-47fb3ad660102fe7b59c05491296000fbdf080a87299fc4b8823cbe50120ca79 WatchSource:0}: Error finding container 47fb3ad660102fe7b59c05491296000fbdf080a87299fc4b8823cbe50120ca79: Status 404 returned error can't find the container with id 47fb3ad660102fe7b59c05491296000fbdf080a87299fc4b8823cbe50120ca79 Apr 17 17:24:17.750942 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:17.750867 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96e2a5d4_ba15_434e_9514_847bbd7fec29.slice/crio-b7817dbac459b08ad307e26f88e95d3af9239e4373fc57d7f9f6ba076af7aca1 WatchSource:0}: Error finding container b7817dbac459b08ad307e26f88e95d3af9239e4373fc57d7f9f6ba076af7aca1: Status 404 returned error can't find the container with id b7817dbac459b08ad307e26f88e95d3af9239e4373fc57d7f9f6ba076af7aca1 Apr 17 17:24:17.752917 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:17.752896 2573 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb2f1cf66_fa2c_4ff9_b222_0a64aebbf351.slice/crio-e17eb36904748093c68e44780aa8efd8ab84e0946e5df65d39c8ff8bc7392ff0 WatchSource:0}: Error finding container e17eb36904748093c68e44780aa8efd8ab84e0946e5df65d39c8ff8bc7392ff0: Status 404 returned error can't find the container with id e17eb36904748093c68e44780aa8efd8ab84e0946e5df65d39c8ff8bc7392ff0 Apr 17 17:24:17.753422 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:17.753394 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c1b9fc8_4b21_4201_8f92_dce101b1890a.slice/crio-c7b4723464222d1690a7c612bf90aca1bcc3b74a54ec0b0c4100c9e78fd18135 WatchSource:0}: Error finding container c7b4723464222d1690a7c612bf90aca1bcc3b74a54ec0b0c4100c9e78fd18135: Status 404 returned error can't find the container with id c7b4723464222d1690a7c612bf90aca1bcc3b74a54ec0b0c4100c9e78fd18135 Apr 17 17:24:17.756109 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:17.756072 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b0827a8_3eb5_4863_995e_e708ba3f0756.slice/crio-1f7a8621316ae235ce28cdcc249b4d5d88cd2fb76a32aed979d506b601cfccf7 WatchSource:0}: Error finding container 1f7a8621316ae235ce28cdcc249b4d5d88cd2fb76a32aed979d506b601cfccf7: Status 404 returned error can't find the container with id 1f7a8621316ae235ce28cdcc249b4d5d88cd2fb76a32aed979d506b601cfccf7 Apr 17 17:24:17.757082 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:17.757051 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod27b1f2c8_8cbd_4922_afec_eee9668dfd31.slice/crio-a652af37da41e24673218f404e2a13f7bae5edf2150f1542dea12fb4eb55279f WatchSource:0}: Error finding container a652af37da41e24673218f404e2a13f7bae5edf2150f1542dea12fb4eb55279f: Status 404 returned error can't find the 
container with id a652af37da41e24673218f404e2a13f7bae5edf2150f1542dea12fb4eb55279f Apr 17 17:24:17.758078 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:17.758004 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb91ff1e_c6f6_45a8_9a4b_3a45c661fd7e.slice/crio-de067a3eb09e6731b83ac6deb1c154251d56b032982b9786cbf25f38096b0b10 WatchSource:0}: Error finding container de067a3eb09e6731b83ac6deb1c154251d56b032982b9786cbf25f38096b0b10: Status 404 returned error can't find the container with id de067a3eb09e6731b83ac6deb1c154251d56b032982b9786cbf25f38096b0b10 Apr 17 17:24:17.759129 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:17.759107 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb7181578_5b02_4803_bf4b_fbb5cec55a12.slice/crio-f3a99a10b1f5f2e10c9552d04c5df30f683eba8b93c348b42423c088c6970418 WatchSource:0}: Error finding container f3a99a10b1f5f2e10c9552d04c5df30f683eba8b93c348b42423c088c6970418: Status 404 returned error can't find the container with id f3a99a10b1f5f2e10c9552d04c5df30f683eba8b93c348b42423c088c6970418 Apr 17 17:24:17.761574 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:17.761179 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d28da5f_188e_4c8f_a4ec_2178fc14de55.slice/crio-53ae411406d043b8c4ba300cc611bba6c70936b64fef5ef5a3150bf3a3dd2021 WatchSource:0}: Error finding container 53ae411406d043b8c4ba300cc611bba6c70936b64fef5ef5a3150bf3a3dd2021: Status 404 returned error can't find the container with id 53ae411406d043b8c4ba300cc611bba6c70936b64fef5ef5a3150bf3a3dd2021 Apr 17 17:24:17.795052 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.795031 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/8a06ed33-b68b-4c09-88ed-8a7af0e52ef4-metrics-certs\") pod \"network-metrics-daemon-llf4n\" (UID: \"8a06ed33-b68b-4c09-88ed-8a7af0e52ef4\") " pod="openshift-multus/network-metrics-daemon-llf4n" Apr 17 17:24:17.795140 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:17.795108 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:24:17.795179 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:17.795161 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a06ed33-b68b-4c09-88ed-8a7af0e52ef4-metrics-certs podName:8a06ed33-b68b-4c09-88ed-8a7af0e52ef4 nodeName:}" failed. No retries permitted until 2026-04-17 17:24:18.795148176 +0000 UTC m=+4.111230000 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8a06ed33-b68b-4c09-88ed-8a7af0e52ef4-metrics-certs") pod "network-metrics-daemon-llf4n" (UID: "8a06ed33-b68b-4c09-88ed-8a7af0e52ef4") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:24:17.896256 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:17.896230 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f9hb6\" (UniqueName: \"kubernetes.io/projected/58b27aad-7dee-48c9-bbb0-67164a5931c5-kube-api-access-f9hb6\") pod \"network-check-target-xckw7\" (UID: \"58b27aad-7dee-48c9-bbb0-67164a5931c5\") " pod="openshift-network-diagnostics/network-check-target-xckw7" Apr 17 17:24:17.896375 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:17.896344 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 17:24:17.896375 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:17.896356 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: 
object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 17:24:17.896375 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:17.896365 2573 projected.go:194] Error preparing data for projected volume kube-api-access-f9hb6 for pod openshift-network-diagnostics/network-check-target-xckw7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:24:17.896480 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:17.896407 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/58b27aad-7dee-48c9-bbb0-67164a5931c5-kube-api-access-f9hb6 podName:58b27aad-7dee-48c9-bbb0-67164a5931c5 nodeName:}" failed. No retries permitted until 2026-04-17 17:24:18.896393439 +0000 UTC m=+4.212475259 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-f9hb6" (UniqueName: "kubernetes.io/projected/58b27aad-7dee-48c9-bbb0-67164a5931c5-kube-api-access-f9hb6") pod "network-check-target-xckw7" (UID: "58b27aad-7dee-48c9-bbb0-67164a5931c5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:24:18.120761 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:18.120629 2573 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 17:19:16 +0000 UTC" deadline="2028-01-10 10:07:05.749992218 +0000 UTC" Apr 17 17:24:18.120761 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:18.120668 2573 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15184h42m47.629328342s" Apr 17 17:24:18.191907 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:18.191450 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xckw7" Apr 17 17:24:18.191907 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:18.191576 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xckw7" podUID="58b27aad-7dee-48c9-bbb0-67164a5931c5" Apr 17 17:24:18.206459 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:18.206394 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-txc9f" event={"ID":"27b1f2c8-8cbd-4922-afec-eee9668dfd31","Type":"ContainerStarted","Data":"a652af37da41e24673218f404e2a13f7bae5edf2150f1542dea12fb4eb55279f"} Apr 17 17:24:18.210872 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:18.210812 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-7l6j9" event={"ID":"9b0827a8-3eb5-4863-995e-e708ba3f0756","Type":"ContainerStarted","Data":"1f7a8621316ae235ce28cdcc249b4d5d88cd2fb76a32aed979d506b601cfccf7"} Apr 17 17:24:18.212929 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:18.212867 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-rfdsw" event={"ID":"b2f1cf66-fa2c-4ff9-b222-0a64aebbf351","Type":"ContainerStarted","Data":"e17eb36904748093c68e44780aa8efd8ab84e0946e5df65d39c8ff8bc7392ff0"} Apr 17 17:24:18.221074 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:18.221048 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2h2v2" event={"ID":"16768e16-66ae-4d96-97c7-c8812143f5c5","Type":"ContainerStarted","Data":"47fb3ad660102fe7b59c05491296000fbdf080a87299fc4b8823cbe50120ca79"} Apr 17 17:24:18.227962 ip-10-0-139-84 
kubenswrapper[2573]: I0417 17:24:18.227934 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-84.ec2.internal" event={"ID":"243a441188fa30c2a72c720d3a8cc5f2","Type":"ContainerStarted","Data":"94905e32333785d67ece2e46481d06a6cfa49a4ee37d9cb3d3da9aa5f360378a"} Apr 17 17:24:18.232477 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:18.232318 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-jwn7v" event={"ID":"5d28da5f-188e-4c8f-a4ec-2178fc14de55","Type":"ContainerStarted","Data":"53ae411406d043b8c4ba300cc611bba6c70936b64fef5ef5a3150bf3a3dd2021"} Apr 17 17:24:18.239321 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:18.239289 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lsns4" event={"ID":"8c1b9fc8-4b21-4201-8f92-dce101b1890a","Type":"ContainerStarted","Data":"c7b4723464222d1690a7c612bf90aca1bcc3b74a54ec0b0c4100c9e78fd18135"} Apr 17 17:24:18.244279 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:18.244232 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-84.ec2.internal" podStartSLOduration=2.244216468 podStartE2EDuration="2.244216468s" podCreationTimestamp="2026-04-17 17:24:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:24:18.243233682 +0000 UTC m=+3.559315526" watchObservedRunningTime="2026-04-17 17:24:18.244216468 +0000 UTC m=+3.560298312" Apr 17 17:24:18.248156 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:18.248132 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-whd7s" event={"ID":"96e2a5d4-ba15-434e-9514-847bbd7fec29","Type":"ContainerStarted","Data":"b7817dbac459b08ad307e26f88e95d3af9239e4373fc57d7f9f6ba076af7aca1"} Apr 17 17:24:18.252283 ip-10-0-139-84 kubenswrapper[2573]: I0417 
17:24:18.252242 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-x8vrb" event={"ID":"b7181578-5b02-4803-bf4b-fbb5cec55a12","Type":"ContainerStarted","Data":"f3a99a10b1f5f2e10c9552d04c5df30f683eba8b93c348b42423c088c6970418"} Apr 17 17:24:18.258571 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:18.258547 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z92s2" event={"ID":"db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e","Type":"ContainerStarted","Data":"de067a3eb09e6731b83ac6deb1c154251d56b032982b9786cbf25f38096b0b10"} Apr 17 17:24:18.804917 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:18.804154 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a06ed33-b68b-4c09-88ed-8a7af0e52ef4-metrics-certs\") pod \"network-metrics-daemon-llf4n\" (UID: \"8a06ed33-b68b-4c09-88ed-8a7af0e52ef4\") " pod="openshift-multus/network-metrics-daemon-llf4n" Apr 17 17:24:18.804917 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:18.804330 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:24:18.804917 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:18.804400 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a06ed33-b68b-4c09-88ed-8a7af0e52ef4-metrics-certs podName:8a06ed33-b68b-4c09-88ed-8a7af0e52ef4 nodeName:}" failed. No retries permitted until 2026-04-17 17:24:20.804379576 +0000 UTC m=+6.120461402 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8a06ed33-b68b-4c09-88ed-8a7af0e52ef4-metrics-certs") pod "network-metrics-daemon-llf4n" (UID: "8a06ed33-b68b-4c09-88ed-8a7af0e52ef4") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:24:18.905480 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:18.905369 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f9hb6\" (UniqueName: \"kubernetes.io/projected/58b27aad-7dee-48c9-bbb0-67164a5931c5-kube-api-access-f9hb6\") pod \"network-check-target-xckw7\" (UID: \"58b27aad-7dee-48c9-bbb0-67164a5931c5\") " pod="openshift-network-diagnostics/network-check-target-xckw7" Apr 17 17:24:18.905653 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:18.905614 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 17:24:18.905716 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:18.905654 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 17:24:18.905716 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:18.905669 2573 projected.go:194] Error preparing data for projected volume kube-api-access-f9hb6 for pod openshift-network-diagnostics/network-check-target-xckw7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:24:18.905826 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:18.905737 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/58b27aad-7dee-48c9-bbb0-67164a5931c5-kube-api-access-f9hb6 podName:58b27aad-7dee-48c9-bbb0-67164a5931c5 nodeName:}" failed. 
No retries permitted until 2026-04-17 17:24:20.90571797 +0000 UTC m=+6.221799810 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-f9hb6" (UniqueName: "kubernetes.io/projected/58b27aad-7dee-48c9-bbb0-67164a5931c5-kube-api-access-f9hb6") pod "network-check-target-xckw7" (UID: "58b27aad-7dee-48c9-bbb0-67164a5931c5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:24:19.194299 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:19.193547 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-llf4n" Apr 17 17:24:19.194299 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:19.193769 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-llf4n" podUID="8a06ed33-b68b-4c09-88ed-8a7af0e52ef4" Apr 17 17:24:19.273137 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:19.272836 2573 generic.go:358] "Generic (PLEG): container finished" podID="52ccb430eec6031f54a284dd821db8ca" containerID="0fc43058d5366e5e9402e08df207ee6ad48479e349bba0e617c7cd1ba5fd5e69" exitCode=0 Apr 17 17:24:19.273137 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:19.273068 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-84.ec2.internal" event={"ID":"52ccb430eec6031f54a284dd821db8ca","Type":"ContainerDied","Data":"0fc43058d5366e5e9402e08df207ee6ad48479e349bba0e617c7cd1ba5fd5e69"} Apr 17 17:24:20.179540 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:20.179452 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-tkbg5"] Apr 17 17:24:20.182269 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:20.182242 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tkbg5" Apr 17 17:24:20.182375 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:20.182322 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-tkbg5" podUID="a32594c3-cfcc-4c83-88d7-263f8c57a927" Apr 17 17:24:20.191622 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:20.191597 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xckw7" Apr 17 17:24:20.191738 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:20.191705 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xckw7" podUID="58b27aad-7dee-48c9-bbb0-67164a5931c5" Apr 17 17:24:20.214951 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:20.214920 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/a32594c3-cfcc-4c83-88d7-263f8c57a927-dbus\") pod \"global-pull-secret-syncer-tkbg5\" (UID: \"a32594c3-cfcc-4c83-88d7-263f8c57a927\") " pod="kube-system/global-pull-secret-syncer-tkbg5" Apr 17 17:24:20.215354 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:20.214985 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a32594c3-cfcc-4c83-88d7-263f8c57a927-original-pull-secret\") pod \"global-pull-secret-syncer-tkbg5\" (UID: \"a32594c3-cfcc-4c83-88d7-263f8c57a927\") " pod="kube-system/global-pull-secret-syncer-tkbg5" Apr 17 17:24:20.215354 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:20.215070 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/a32594c3-cfcc-4c83-88d7-263f8c57a927-kubelet-config\") pod \"global-pull-secret-syncer-tkbg5\" (UID: \"a32594c3-cfcc-4c83-88d7-263f8c57a927\") " pod="kube-system/global-pull-secret-syncer-tkbg5" Apr 17 17:24:20.285067 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:20.285034 2573 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-84.ec2.internal" event={"ID":"52ccb430eec6031f54a284dd821db8ca","Type":"ContainerStarted","Data":"32838dcb781b38f702093999ea720e34188a8b2d206eb1f7715c2211422d8d96"} Apr 17 17:24:20.315563 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:20.315532 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/a32594c3-cfcc-4c83-88d7-263f8c57a927-dbus\") pod \"global-pull-secret-syncer-tkbg5\" (UID: \"a32594c3-cfcc-4c83-88d7-263f8c57a927\") " pod="kube-system/global-pull-secret-syncer-tkbg5" Apr 17 17:24:20.315720 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:20.315594 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a32594c3-cfcc-4c83-88d7-263f8c57a927-original-pull-secret\") pod \"global-pull-secret-syncer-tkbg5\" (UID: \"a32594c3-cfcc-4c83-88d7-263f8c57a927\") " pod="kube-system/global-pull-secret-syncer-tkbg5" Apr 17 17:24:20.315720 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:20.315648 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/a32594c3-cfcc-4c83-88d7-263f8c57a927-kubelet-config\") pod \"global-pull-secret-syncer-tkbg5\" (UID: \"a32594c3-cfcc-4c83-88d7-263f8c57a927\") " pod="kube-system/global-pull-secret-syncer-tkbg5" Apr 17 17:24:20.315720 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:20.315716 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/a32594c3-cfcc-4c83-88d7-263f8c57a927-kubelet-config\") pod \"global-pull-secret-syncer-tkbg5\" (UID: \"a32594c3-cfcc-4c83-88d7-263f8c57a927\") " pod="kube-system/global-pull-secret-syncer-tkbg5" Apr 17 17:24:20.315879 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:20.315863 2573 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/a32594c3-cfcc-4c83-88d7-263f8c57a927-dbus\") pod \"global-pull-secret-syncer-tkbg5\" (UID: \"a32594c3-cfcc-4c83-88d7-263f8c57a927\") " pod="kube-system/global-pull-secret-syncer-tkbg5" Apr 17 17:24:20.315988 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:20.315972 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 17:24:20.316073 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:20.316064 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a32594c3-cfcc-4c83-88d7-263f8c57a927-original-pull-secret podName:a32594c3-cfcc-4c83-88d7-263f8c57a927 nodeName:}" failed. No retries permitted until 2026-04-17 17:24:20.816043431 +0000 UTC m=+6.132125253 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/a32594c3-cfcc-4c83-88d7-263f8c57a927-original-pull-secret") pod "global-pull-secret-syncer-tkbg5" (UID: "a32594c3-cfcc-4c83-88d7-263f8c57a927") : object "kube-system"/"original-pull-secret" not registered Apr 17 17:24:20.821302 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:20.821039 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a32594c3-cfcc-4c83-88d7-263f8c57a927-original-pull-secret\") pod \"global-pull-secret-syncer-tkbg5\" (UID: \"a32594c3-cfcc-4c83-88d7-263f8c57a927\") " pod="kube-system/global-pull-secret-syncer-tkbg5" Apr 17 17:24:20.821302 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:20.821094 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a06ed33-b68b-4c09-88ed-8a7af0e52ef4-metrics-certs\") pod \"network-metrics-daemon-llf4n\" (UID: \"8a06ed33-b68b-4c09-88ed-8a7af0e52ef4\") " 
pod="openshift-multus/network-metrics-daemon-llf4n" Apr 17 17:24:20.821302 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:20.821249 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:24:20.821302 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:20.821312 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a06ed33-b68b-4c09-88ed-8a7af0e52ef4-metrics-certs podName:8a06ed33-b68b-4c09-88ed-8a7af0e52ef4 nodeName:}" failed. No retries permitted until 2026-04-17 17:24:24.821292634 +0000 UTC m=+10.137374469 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8a06ed33-b68b-4c09-88ed-8a7af0e52ef4-metrics-certs") pod "network-metrics-daemon-llf4n" (UID: "8a06ed33-b68b-4c09-88ed-8a7af0e52ef4") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:24:20.821631 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:20.821251 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 17:24:20.821685 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:20.821655 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a32594c3-cfcc-4c83-88d7-263f8c57a927-original-pull-secret podName:a32594c3-cfcc-4c83-88d7-263f8c57a927 nodeName:}" failed. No retries permitted until 2026-04-17 17:24:21.821634747 +0000 UTC m=+7.137716571 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/a32594c3-cfcc-4c83-88d7-263f8c57a927-original-pull-secret") pod "global-pull-secret-syncer-tkbg5" (UID: "a32594c3-cfcc-4c83-88d7-263f8c57a927") : object "kube-system"/"original-pull-secret" not registered Apr 17 17:24:20.921799 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:20.921760 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f9hb6\" (UniqueName: \"kubernetes.io/projected/58b27aad-7dee-48c9-bbb0-67164a5931c5-kube-api-access-f9hb6\") pod \"network-check-target-xckw7\" (UID: \"58b27aad-7dee-48c9-bbb0-67164a5931c5\") " pod="openshift-network-diagnostics/network-check-target-xckw7" Apr 17 17:24:20.921963 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:20.921922 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 17:24:20.921963 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:20.921942 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 17:24:20.921963 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:20.921956 2573 projected.go:194] Error preparing data for projected volume kube-api-access-f9hb6 for pod openshift-network-diagnostics/network-check-target-xckw7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:24:20.922152 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:20.922006 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/58b27aad-7dee-48c9-bbb0-67164a5931c5-kube-api-access-f9hb6 podName:58b27aad-7dee-48c9-bbb0-67164a5931c5 nodeName:}" failed. 
No retries permitted until 2026-04-17 17:24:24.921993069 +0000 UTC m=+10.238074889 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-f9hb6" (UniqueName: "kubernetes.io/projected/58b27aad-7dee-48c9-bbb0-67164a5931c5-kube-api-access-f9hb6") pod "network-check-target-xckw7" (UID: "58b27aad-7dee-48c9-bbb0-67164a5931c5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:24:21.192209 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:21.192135 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-llf4n" Apr 17 17:24:21.192353 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:21.192279 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-llf4n" podUID="8a06ed33-b68b-4c09-88ed-8a7af0e52ef4" Apr 17 17:24:21.832369 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:21.832326 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a32594c3-cfcc-4c83-88d7-263f8c57a927-original-pull-secret\") pod \"global-pull-secret-syncer-tkbg5\" (UID: \"a32594c3-cfcc-4c83-88d7-263f8c57a927\") " pod="kube-system/global-pull-secret-syncer-tkbg5" Apr 17 17:24:21.832826 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:21.832528 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 17:24:21.832826 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:21.832598 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a32594c3-cfcc-4c83-88d7-263f8c57a927-original-pull-secret podName:a32594c3-cfcc-4c83-88d7-263f8c57a927 nodeName:}" failed. No retries permitted until 2026-04-17 17:24:23.832577566 +0000 UTC m=+9.148659388 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/a32594c3-cfcc-4c83-88d7-263f8c57a927-original-pull-secret") pod "global-pull-secret-syncer-tkbg5" (UID: "a32594c3-cfcc-4c83-88d7-263f8c57a927") : object "kube-system"/"original-pull-secret" not registered Apr 17 17:24:22.192905 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:22.192362 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xckw7" Apr 17 17:24:22.192905 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:22.192511 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xckw7" podUID="58b27aad-7dee-48c9-bbb0-67164a5931c5" Apr 17 17:24:22.192905 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:22.192545 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tkbg5" Apr 17 17:24:22.192905 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:22.192647 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-tkbg5" podUID="a32594c3-cfcc-4c83-88d7-263f8c57a927" Apr 17 17:24:23.191696 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:23.191618 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-llf4n" Apr 17 17:24:23.192167 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:23.191765 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-llf4n" podUID="8a06ed33-b68b-4c09-88ed-8a7af0e52ef4" Apr 17 17:24:23.847325 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:23.847288 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a32594c3-cfcc-4c83-88d7-263f8c57a927-original-pull-secret\") pod \"global-pull-secret-syncer-tkbg5\" (UID: \"a32594c3-cfcc-4c83-88d7-263f8c57a927\") " pod="kube-system/global-pull-secret-syncer-tkbg5" Apr 17 17:24:23.847492 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:23.847452 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 17:24:23.847552 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:23.847510 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a32594c3-cfcc-4c83-88d7-263f8c57a927-original-pull-secret podName:a32594c3-cfcc-4c83-88d7-263f8c57a927 nodeName:}" failed. No retries permitted until 2026-04-17 17:24:27.847493569 +0000 UTC m=+13.163575403 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/a32594c3-cfcc-4c83-88d7-263f8c57a927-original-pull-secret") pod "global-pull-secret-syncer-tkbg5" (UID: "a32594c3-cfcc-4c83-88d7-263f8c57a927") : object "kube-system"/"original-pull-secret" not registered Apr 17 17:24:24.192291 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:24.192210 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tkbg5" Apr 17 17:24:24.192733 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:24.192224 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xckw7" Apr 17 17:24:24.192733 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:24.192345 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-tkbg5" podUID="a32594c3-cfcc-4c83-88d7-263f8c57a927" Apr 17 17:24:24.192733 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:24.192373 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xckw7" podUID="58b27aad-7dee-48c9-bbb0-67164a5931c5" Apr 17 17:24:24.856252 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:24.855779 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a06ed33-b68b-4c09-88ed-8a7af0e52ef4-metrics-certs\") pod \"network-metrics-daemon-llf4n\" (UID: \"8a06ed33-b68b-4c09-88ed-8a7af0e52ef4\") " pod="openshift-multus/network-metrics-daemon-llf4n" Apr 17 17:24:24.856252 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:24.855897 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:24:24.856252 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:24.855947 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a06ed33-b68b-4c09-88ed-8a7af0e52ef4-metrics-certs podName:8a06ed33-b68b-4c09-88ed-8a7af0e52ef4 nodeName:}" failed. 
No retries permitted until 2026-04-17 17:24:32.85593384 +0000 UTC m=+18.172015661 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8a06ed33-b68b-4c09-88ed-8a7af0e52ef4-metrics-certs") pod "network-metrics-daemon-llf4n" (UID: "8a06ed33-b68b-4c09-88ed-8a7af0e52ef4") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:24:24.957685 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:24.957122 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f9hb6\" (UniqueName: \"kubernetes.io/projected/58b27aad-7dee-48c9-bbb0-67164a5931c5-kube-api-access-f9hb6\") pod \"network-check-target-xckw7\" (UID: \"58b27aad-7dee-48c9-bbb0-67164a5931c5\") " pod="openshift-network-diagnostics/network-check-target-xckw7" Apr 17 17:24:24.957685 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:24.957273 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 17:24:24.957685 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:24.957293 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 17:24:24.957685 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:24.957305 2573 projected.go:194] Error preparing data for projected volume kube-api-access-f9hb6 for pod openshift-network-diagnostics/network-check-target-xckw7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:24:24.957685 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:24.957358 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/58b27aad-7dee-48c9-bbb0-67164a5931c5-kube-api-access-f9hb6 
podName:58b27aad-7dee-48c9-bbb0-67164a5931c5 nodeName:}" failed. No retries permitted until 2026-04-17 17:24:32.957339337 +0000 UTC m=+18.273421163 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-f9hb6" (UniqueName: "kubernetes.io/projected/58b27aad-7dee-48c9-bbb0-67164a5931c5-kube-api-access-f9hb6") pod "network-check-target-xckw7" (UID: "58b27aad-7dee-48c9-bbb0-67164a5931c5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:24:25.192719 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:25.192345 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-llf4n" Apr 17 17:24:25.192719 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:25.192477 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-llf4n" podUID="8a06ed33-b68b-4c09-88ed-8a7af0e52ef4" Apr 17 17:24:26.191554 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:26.191527 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xckw7" Apr 17 17:24:26.191785 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:26.191531 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-tkbg5" Apr 17 17:24:26.191785 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:26.191640 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xckw7" podUID="58b27aad-7dee-48c9-bbb0-67164a5931c5" Apr 17 17:24:26.191785 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:26.191711 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-tkbg5" podUID="a32594c3-cfcc-4c83-88d7-263f8c57a927" Apr 17 17:24:27.192363 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:27.192279 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-llf4n" Apr 17 17:24:27.192762 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:27.192407 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-llf4n" podUID="8a06ed33-b68b-4c09-88ed-8a7af0e52ef4" Apr 17 17:24:27.882514 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:27.882473 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a32594c3-cfcc-4c83-88d7-263f8c57a927-original-pull-secret\") pod \"global-pull-secret-syncer-tkbg5\" (UID: \"a32594c3-cfcc-4c83-88d7-263f8c57a927\") " pod="kube-system/global-pull-secret-syncer-tkbg5" Apr 17 17:24:27.882687 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:27.882616 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 17:24:27.882773 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:27.882696 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a32594c3-cfcc-4c83-88d7-263f8c57a927-original-pull-secret podName:a32594c3-cfcc-4c83-88d7-263f8c57a927 nodeName:}" failed. No retries permitted until 2026-04-17 17:24:35.882673974 +0000 UTC m=+21.198755807 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/a32594c3-cfcc-4c83-88d7-263f8c57a927-original-pull-secret") pod "global-pull-secret-syncer-tkbg5" (UID: "a32594c3-cfcc-4c83-88d7-263f8c57a927") : object "kube-system"/"original-pull-secret" not registered Apr 17 17:24:28.192132 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:28.192067 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xckw7" Apr 17 17:24:28.192132 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:28.192108 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-tkbg5" Apr 17 17:24:28.192319 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:28.192178 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-tkbg5" podUID="a32594c3-cfcc-4c83-88d7-263f8c57a927" Apr 17 17:24:28.192319 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:28.192244 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xckw7" podUID="58b27aad-7dee-48c9-bbb0-67164a5931c5" Apr 17 17:24:29.191680 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:29.191652 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-llf4n" Apr 17 17:24:29.192140 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:29.191775 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-llf4n" podUID="8a06ed33-b68b-4c09-88ed-8a7af0e52ef4" Apr 17 17:24:30.191911 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:30.191883 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xckw7" Apr 17 17:24:30.192270 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:30.191994 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xckw7" podUID="58b27aad-7dee-48c9-bbb0-67164a5931c5" Apr 17 17:24:30.192270 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:30.192035 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tkbg5" Apr 17 17:24:30.192270 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:30.192100 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-tkbg5" podUID="a32594c3-cfcc-4c83-88d7-263f8c57a927" Apr 17 17:24:31.192290 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:31.192252 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-llf4n" Apr 17 17:24:31.192750 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:31.192403 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-llf4n" podUID="8a06ed33-b68b-4c09-88ed-8a7af0e52ef4" Apr 17 17:24:32.192289 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:32.192251 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xckw7" Apr 17 17:24:32.192518 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:32.192251 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tkbg5" Apr 17 17:24:32.192518 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:32.192374 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xckw7" podUID="58b27aad-7dee-48c9-bbb0-67164a5931c5" Apr 17 17:24:32.192518 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:32.192484 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-tkbg5" podUID="a32594c3-cfcc-4c83-88d7-263f8c57a927" Apr 17 17:24:32.921415 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:32.921381 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a06ed33-b68b-4c09-88ed-8a7af0e52ef4-metrics-certs\") pod \"network-metrics-daemon-llf4n\" (UID: \"8a06ed33-b68b-4c09-88ed-8a7af0e52ef4\") " pod="openshift-multus/network-metrics-daemon-llf4n" Apr 17 17:24:32.921624 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:32.921540 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:24:32.921624 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:32.921613 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a06ed33-b68b-4c09-88ed-8a7af0e52ef4-metrics-certs podName:8a06ed33-b68b-4c09-88ed-8a7af0e52ef4 nodeName:}" failed. No retries permitted until 2026-04-17 17:24:48.921594404 +0000 UTC m=+34.237676225 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8a06ed33-b68b-4c09-88ed-8a7af0e52ef4-metrics-certs") pod "network-metrics-daemon-llf4n" (UID: "8a06ed33-b68b-4c09-88ed-8a7af0e52ef4") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:24:33.022554 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:33.022517 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f9hb6\" (UniqueName: \"kubernetes.io/projected/58b27aad-7dee-48c9-bbb0-67164a5931c5-kube-api-access-f9hb6\") pod \"network-check-target-xckw7\" (UID: \"58b27aad-7dee-48c9-bbb0-67164a5931c5\") " pod="openshift-network-diagnostics/network-check-target-xckw7" Apr 17 17:24:33.022717 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:33.022677 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 17:24:33.022717 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:33.022697 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 17:24:33.022717 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:33.022707 2573 projected.go:194] Error preparing data for projected volume kube-api-access-f9hb6 for pod openshift-network-diagnostics/network-check-target-xckw7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:24:33.022839 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:33.022757 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/58b27aad-7dee-48c9-bbb0-67164a5931c5-kube-api-access-f9hb6 podName:58b27aad-7dee-48c9-bbb0-67164a5931c5 nodeName:}" failed. 
No retries permitted until 2026-04-17 17:24:49.022743505 +0000 UTC m=+34.338825325 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-f9hb6" (UniqueName: "kubernetes.io/projected/58b27aad-7dee-48c9-bbb0-67164a5931c5-kube-api-access-f9hb6") pod "network-check-target-xckw7" (UID: "58b27aad-7dee-48c9-bbb0-67164a5931c5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:24:33.192530 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:33.192443 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-llf4n" Apr 17 17:24:33.192884 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:33.192574 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-llf4n" podUID="8a06ed33-b68b-4c09-88ed-8a7af0e52ef4" Apr 17 17:24:34.191892 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:34.191860 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xckw7" Apr 17 17:24:34.192076 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:34.191860 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tkbg5" Apr 17 17:24:34.192076 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:34.191991 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xckw7" podUID="58b27aad-7dee-48c9-bbb0-67164a5931c5" Apr 17 17:24:34.192202 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:34.192079 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-tkbg5" podUID="a32594c3-cfcc-4c83-88d7-263f8c57a927" Apr 17 17:24:35.192827 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:35.192415 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-llf4n" Apr 17 17:24:35.193588 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:35.192944 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-llf4n" podUID="8a06ed33-b68b-4c09-88ed-8a7af0e52ef4" Apr 17 17:24:35.312705 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:35.312666 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-x8vrb" event={"ID":"b7181578-5b02-4803-bf4b-fbb5cec55a12","Type":"ContainerStarted","Data":"96fd91fbe6cfdf97f564c3ea07b1e9aa32f24cf65c11c27f289124b8bbaab2b3"} Apr 17 17:24:35.315879 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:35.315851 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z92s2" event={"ID":"db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e","Type":"ContainerStarted","Data":"6e17225e67ca7fbe6aa3afe6e1b3f82bbfd98890ece05d42470ec6d2ab32e339"} Apr 17 17:24:35.315990 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:35.315887 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z92s2" event={"ID":"db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e","Type":"ContainerStarted","Data":"c3d02a2621ee39010c70b9e0f9f0cb7436c7c3db86db7d71796ff7ac3fbb7e9c"} Apr 17 17:24:35.315990 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:35.315903 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z92s2" event={"ID":"db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e","Type":"ContainerStarted","Data":"b8430818f318dfc2bdecfe885ee045650e249ba941479cc706774779034841c2"} Apr 17 17:24:35.315990 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:35.315914 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z92s2" event={"ID":"db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e","Type":"ContainerStarted","Data":"f7e2917a24e6e888305d88397e4db60f5e3f33847cf34eb3d5b68d17e95faa55"} Apr 17 17:24:35.315990 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:35.315926 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z92s2" 
event={"ID":"db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e","Type":"ContainerStarted","Data":"dc18729332da25425b5105f30b408121ddde8456321eb602cbc97a187fd5f7b0"} Apr 17 17:24:35.315990 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:35.315939 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z92s2" event={"ID":"db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e","Type":"ContainerStarted","Data":"dca6bf093b951539cae67052ee762b3ba3cc619608352120e8747d0eb58379ff"} Apr 17 17:24:35.317594 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:35.317564 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-txc9f" event={"ID":"27b1f2c8-8cbd-4922-afec-eee9668dfd31","Type":"ContainerStarted","Data":"35ce25f1af2c0bae7b359283673cbd24d72b3f29137cc8483f656c55f238f11b"} Apr 17 17:24:35.320955 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:35.320931 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-7l6j9" event={"ID":"9b0827a8-3eb5-4863-995e-e708ba3f0756","Type":"ContainerStarted","Data":"779e037cb44ed39c5e21eeb33a2d3f42fa3df4bffd5942b63dd83779c10a843f"} Apr 17 17:24:35.323645 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:35.323621 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2h2v2" event={"ID":"16768e16-66ae-4d96-97c7-c8812143f5c5","Type":"ContainerStarted","Data":"a59832adadbf2c4f3c1efeb5e50040e57a46378395e87397ab3c1546993d22d6"} Apr 17 17:24:35.325027 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:35.324989 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-jwn7v" event={"ID":"5d28da5f-188e-4c8f-a4ec-2178fc14de55","Type":"ContainerStarted","Data":"b7c39efe311c963d7888d11328da86901bf66839f2c822284414cf4f2d5be7c9"} Apr 17 17:24:35.326375 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:35.326352 2573 generic.go:358] "Generic (PLEG): container finished" 
podID="8c1b9fc8-4b21-4201-8f92-dce101b1890a" containerID="efead3e951e095c182f141d0df0f476f74a3264ecf055c2f79769bee591c70ae" exitCode=0 Apr 17 17:24:35.326480 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:35.326423 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lsns4" event={"ID":"8c1b9fc8-4b21-4201-8f92-dce101b1890a","Type":"ContainerDied","Data":"efead3e951e095c182f141d0df0f476f74a3264ecf055c2f79769bee591c70ae"} Apr 17 17:24:35.328001 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:35.327887 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-whd7s" event={"ID":"96e2a5d4-ba15-434e-9514-847bbd7fec29","Type":"ContainerStarted","Data":"d49741ef1ac2c31afd0da3114f3e1e9e34ff5e5430e71101d9285886ed3b879e"} Apr 17 17:24:35.346764 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:35.346725 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-84.ec2.internal" podStartSLOduration=19.346710818 podStartE2EDuration="19.346710818s" podCreationTimestamp="2026-04-17 17:24:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:24:20.306116255 +0000 UTC m=+5.622198098" watchObservedRunningTime="2026-04-17 17:24:35.346710818 +0000 UTC m=+20.662792662" Apr 17 17:24:35.346867 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:35.346813 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-x8vrb" podStartSLOduration=3.143555906 podStartE2EDuration="20.346804203s" podCreationTimestamp="2026-04-17 17:24:15 +0000 UTC" firstStartedPulling="2026-04-17 17:24:17.762099686 +0000 UTC m=+3.078181506" lastFinishedPulling="2026-04-17 17:24:34.965347979 +0000 UTC m=+20.281429803" observedRunningTime="2026-04-17 17:24:35.346557186 +0000 UTC m=+20.662639058" 
watchObservedRunningTime="2026-04-17 17:24:35.346804203 +0000 UTC m=+20.662886045" Apr 17 17:24:35.371214 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:35.371169 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-txc9f" podStartSLOduration=3.44167335 podStartE2EDuration="20.371157559s" podCreationTimestamp="2026-04-17 17:24:15 +0000 UTC" firstStartedPulling="2026-04-17 17:24:17.76144594 +0000 UTC m=+3.077527763" lastFinishedPulling="2026-04-17 17:24:34.690930137 +0000 UTC m=+20.007011972" observedRunningTime="2026-04-17 17:24:35.3696645 +0000 UTC m=+20.685746342" watchObservedRunningTime="2026-04-17 17:24:35.371157559 +0000 UTC m=+20.687239402" Apr 17 17:24:35.395199 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:35.395160 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-jwn7v" podStartSLOduration=3.467349783 podStartE2EDuration="20.395147814s" podCreationTimestamp="2026-04-17 17:24:15 +0000 UTC" firstStartedPulling="2026-04-17 17:24:17.763198481 +0000 UTC m=+3.079280302" lastFinishedPulling="2026-04-17 17:24:34.690996504 +0000 UTC m=+20.007078333" observedRunningTime="2026-04-17 17:24:35.394863667 +0000 UTC m=+20.710945519" watchObservedRunningTime="2026-04-17 17:24:35.395147814 +0000 UTC m=+20.711229656" Apr 17 17:24:35.429928 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:35.429887 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-whd7s" podStartSLOduration=3.491365464 podStartE2EDuration="20.429875679s" podCreationTimestamp="2026-04-17 17:24:15 +0000 UTC" firstStartedPulling="2026-04-17 17:24:17.752433934 +0000 UTC m=+3.068515754" lastFinishedPulling="2026-04-17 17:24:34.690944149 +0000 UTC m=+20.007025969" observedRunningTime="2026-04-17 17:24:35.429803345 +0000 UTC m=+20.745885188" watchObservedRunningTime="2026-04-17 17:24:35.429875679 +0000 UTC m=+20.745957522" 
Apr 17 17:24:35.475123 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:35.475044 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-7l6j9" podStartSLOduration=3.5429494310000003 podStartE2EDuration="20.475006556s" podCreationTimestamp="2026-04-17 17:24:15 +0000 UTC" firstStartedPulling="2026-04-17 17:24:17.759138319 +0000 UTC m=+3.075220139" lastFinishedPulling="2026-04-17 17:24:34.69119543 +0000 UTC m=+20.007277264" observedRunningTime="2026-04-17 17:24:35.47438605 +0000 UTC m=+20.790467891" watchObservedRunningTime="2026-04-17 17:24:35.475006556 +0000 UTC m=+20.791088399" Apr 17 17:24:35.817220 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:35.817199 2573 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 17 17:24:35.950411 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:35.950373 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a32594c3-cfcc-4c83-88d7-263f8c57a927-original-pull-secret\") pod \"global-pull-secret-syncer-tkbg5\" (UID: \"a32594c3-cfcc-4c83-88d7-263f8c57a927\") " pod="kube-system/global-pull-secret-syncer-tkbg5" Apr 17 17:24:35.950588 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:35.950492 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 17:24:35.950588 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:35.950549 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a32594c3-cfcc-4c83-88d7-263f8c57a927-original-pull-secret podName:a32594c3-cfcc-4c83-88d7-263f8c57a927 nodeName:}" failed. No retries permitted until 2026-04-17 17:24:51.950532532 +0000 UTC m=+37.266614361 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/a32594c3-cfcc-4c83-88d7-263f8c57a927-original-pull-secret") pod "global-pull-secret-syncer-tkbg5" (UID: "a32594c3-cfcc-4c83-88d7-263f8c57a927") : object "kube-system"/"original-pull-secret" not registered Apr 17 17:24:36.141808 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:36.141656 2573 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-17T17:24:35.81721591Z","UUID":"d66859c0-2cd1-4d91-bc31-ddbf8a06abc4","Handler":null,"Name":"","Endpoint":""} Apr 17 17:24:36.143411 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:36.143388 2573 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 17 17:24:36.143543 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:36.143421 2573 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 17 17:24:36.192125 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:36.192098 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tkbg5" Apr 17 17:24:36.192257 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:36.192098 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xckw7" Apr 17 17:24:36.192310 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:36.192200 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-tkbg5" podUID="a32594c3-cfcc-4c83-88d7-263f8c57a927" Apr 17 17:24:36.192374 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:36.192300 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xckw7" podUID="58b27aad-7dee-48c9-bbb0-67164a5931c5" Apr 17 17:24:36.332055 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:36.331949 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-rfdsw" event={"ID":"b2f1cf66-fa2c-4ff9-b222-0a64aebbf351","Type":"ContainerStarted","Data":"8c0d415875bdf67c8c45a700b35f2d0ea6a74daf8dcef34db599b73eab5a9d33"} Apr 17 17:24:36.334337 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:36.334281 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2h2v2" event={"ID":"16768e16-66ae-4d96-97c7-c8812143f5c5","Type":"ContainerStarted","Data":"1af2567d1b506f9d2bc970f9c1fe5e52c31c69d0e6c70806bd34dc75fc6f4db8"} Apr 17 17:24:36.347547 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:36.347511 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-rfdsw" podStartSLOduration=4.4110967 podStartE2EDuration="21.347498365s" podCreationTimestamp="2026-04-17 17:24:15 +0000 UTC" firstStartedPulling="2026-04-17 17:24:17.75480085 +0000 UTC m=+3.070882673" lastFinishedPulling="2026-04-17 17:24:34.691202512 +0000 UTC m=+20.007284338" observedRunningTime="2026-04-17 17:24:36.347273337 +0000 UTC m=+21.663355182" watchObservedRunningTime="2026-04-17 17:24:36.347498365 +0000 UTC m=+21.663580525" Apr 17 17:24:36.586854 ip-10-0-139-84 kubenswrapper[2573]: I0417 
17:24:36.586755 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-jwn7v" Apr 17 17:24:36.587559 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:36.587533 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-jwn7v" Apr 17 17:24:37.191625 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:37.191592 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-llf4n" Apr 17 17:24:37.191898 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:37.191729 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-llf4n" podUID="8a06ed33-b68b-4c09-88ed-8a7af0e52ef4" Apr 17 17:24:37.337901 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:37.337839 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2h2v2" event={"ID":"16768e16-66ae-4d96-97c7-c8812143f5c5","Type":"ContainerStarted","Data":"0371dad30801db77cbba48a6b643fc79bbd8fdfb77a1847b3dbd4234e1594e35"} Apr 17 17:24:37.338440 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:37.338421 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-jwn7v" Apr 17 17:24:37.338963 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:37.338934 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-jwn7v" Apr 17 17:24:37.356690 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:37.356652 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2h2v2" 
podStartSLOduration=3.437608785 podStartE2EDuration="22.356639195s" podCreationTimestamp="2026-04-17 17:24:15 +0000 UTC" firstStartedPulling="2026-04-17 17:24:17.751179479 +0000 UTC m=+3.067261299" lastFinishedPulling="2026-04-17 17:24:36.670209874 +0000 UTC m=+21.986291709" observedRunningTime="2026-04-17 17:24:37.356133435 +0000 UTC m=+22.672215277" watchObservedRunningTime="2026-04-17 17:24:37.356639195 +0000 UTC m=+22.672721034" Apr 17 17:24:38.191819 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:38.191588 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tkbg5" Apr 17 17:24:38.192033 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:38.191611 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xckw7" Apr 17 17:24:38.192033 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:38.191932 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-tkbg5" podUID="a32594c3-cfcc-4c83-88d7-263f8c57a927" Apr 17 17:24:38.192166 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:38.192037 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xckw7" podUID="58b27aad-7dee-48c9-bbb0-67164a5931c5" Apr 17 17:24:38.342674 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:38.342629 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z92s2" event={"ID":"db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e","Type":"ContainerStarted","Data":"4ab94e09ef8c5c9d1b48ee9bba67d007e8238376bab5f42264fe0a937b45e3d4"} Apr 17 17:24:39.191668 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:39.191633 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-llf4n" Apr 17 17:24:39.191898 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:39.191765 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-llf4n" podUID="8a06ed33-b68b-4c09-88ed-8a7af0e52ef4" Apr 17 17:24:40.192042 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:40.191958 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xckw7" Apr 17 17:24:40.192682 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:40.191957 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tkbg5" Apr 17 17:24:40.192682 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:40.192065 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xckw7" podUID="58b27aad-7dee-48c9-bbb0-67164a5931c5" Apr 17 17:24:40.192682 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:40.192158 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-tkbg5" podUID="a32594c3-cfcc-4c83-88d7-263f8c57a927" Apr 17 17:24:40.347633 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:40.347603 2573 generic.go:358] "Generic (PLEG): container finished" podID="8c1b9fc8-4b21-4201-8f92-dce101b1890a" containerID="e7985f3657a9162d0930537c14d5380497c1e62851b7e83fb2ef400c63503c97" exitCode=0 Apr 17 17:24:40.347776 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:40.347674 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lsns4" event={"ID":"8c1b9fc8-4b21-4201-8f92-dce101b1890a","Type":"ContainerDied","Data":"e7985f3657a9162d0930537c14d5380497c1e62851b7e83fb2ef400c63503c97"} Apr 17 17:24:40.350780 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:40.350754 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z92s2" event={"ID":"db91ff1e-c6f6-45a8-9a4b-3a45c661fd7e","Type":"ContainerStarted","Data":"0216545e01d048e2ff6a4e387fa04ca17a1797599f9edbff2ff2b8151d252ee8"} Apr 17 17:24:40.351079 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:40.351058 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-z92s2" Apr 17 17:24:40.351188 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:40.351174 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-z92s2" Apr 17 17:24:40.351256 ip-10-0-139-84 
kubenswrapper[2573]: I0417 17:24:40.351192 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-z92s2" Apr 17 17:24:40.365820 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:40.365802 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-z92s2" Apr 17 17:24:40.365909 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:40.365863 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-z92s2" Apr 17 17:24:40.394630 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:40.394592 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-z92s2" podStartSLOduration=8.399843475 podStartE2EDuration="25.394580867s" podCreationTimestamp="2026-04-17 17:24:15 +0000 UTC" firstStartedPulling="2026-04-17 17:24:17.76223477 +0000 UTC m=+3.078316595" lastFinishedPulling="2026-04-17 17:24:34.756972148 +0000 UTC m=+20.073053987" observedRunningTime="2026-04-17 17:24:40.393731275 +0000 UTC m=+25.709813116" watchObservedRunningTime="2026-04-17 17:24:40.394580867 +0000 UTC m=+25.710662708" Apr 17 17:24:41.195789 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:41.195763 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-llf4n" Apr 17 17:24:41.196101 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:41.195894 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-llf4n" podUID="8a06ed33-b68b-4c09-88ed-8a7af0e52ef4" Apr 17 17:24:41.353850 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:41.353769 2573 generic.go:358] "Generic (PLEG): container finished" podID="8c1b9fc8-4b21-4201-8f92-dce101b1890a" containerID="4f8de89a3d3b4fdb1139581c9d99b0624f32d6347a765fcfbd852288b3e7300e" exitCode=0 Apr 17 17:24:41.353993 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:41.353858 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lsns4" event={"ID":"8c1b9fc8-4b21-4201-8f92-dce101b1890a","Type":"ContainerDied","Data":"4f8de89a3d3b4fdb1139581c9d99b0624f32d6347a765fcfbd852288b3e7300e"} Apr 17 17:24:41.416912 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:41.416884 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-tkbg5"] Apr 17 17:24:41.417037 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:41.416963 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tkbg5" Apr 17 17:24:41.417094 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:41.417052 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-tkbg5" podUID="a32594c3-cfcc-4c83-88d7-263f8c57a927" Apr 17 17:24:41.423238 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:41.423211 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-llf4n"] Apr 17 17:24:41.423474 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:41.423451 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-llf4n" Apr 17 17:24:41.423564 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:41.423542 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-llf4n" podUID="8a06ed33-b68b-4c09-88ed-8a7af0e52ef4" Apr 17 17:24:41.424287 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:41.424263 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-xckw7"] Apr 17 17:24:41.424371 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:41.424352 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xckw7" Apr 17 17:24:41.424451 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:41.424425 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xckw7" podUID="58b27aad-7dee-48c9-bbb0-67164a5931c5" Apr 17 17:24:42.357236 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:42.357202 2573 generic.go:358] "Generic (PLEG): container finished" podID="8c1b9fc8-4b21-4201-8f92-dce101b1890a" containerID="469d079aca9c7d42c66987a08f524a697ea2f2ff6bad5a2476eed2b0223f8007" exitCode=0 Apr 17 17:24:42.357576 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:42.357282 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lsns4" event={"ID":"8c1b9fc8-4b21-4201-8f92-dce101b1890a","Type":"ContainerDied","Data":"469d079aca9c7d42c66987a08f524a697ea2f2ff6bad5a2476eed2b0223f8007"} Apr 17 17:24:43.195285 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:43.195255 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xckw7" Apr 17 17:24:43.195503 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:43.195262 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tkbg5" Apr 17 17:24:43.195503 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:43.195346 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xckw7" podUID="58b27aad-7dee-48c9-bbb0-67164a5931c5" Apr 17 17:24:43.195503 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:43.195443 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="kube-system/global-pull-secret-syncer-tkbg5" podUID="a32594c3-cfcc-4c83-88d7-263f8c57a927" Apr 17 17:24:43.195503 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:43.195472 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-llf4n" Apr 17 17:24:43.195740 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:43.195555 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-llf4n" podUID="8a06ed33-b68b-4c09-88ed-8a7af0e52ef4" Apr 17 17:24:45.192591 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:45.192560 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-llf4n" Apr 17 17:24:45.193342 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:45.192672 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-llf4n" podUID="8a06ed33-b68b-4c09-88ed-8a7af0e52ef4" Apr 17 17:24:45.193342 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:45.193074 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xckw7" Apr 17 17:24:45.193342 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:45.193172 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-tkbg5" Apr 17 17:24:45.193342 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:45.193193 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xckw7" podUID="58b27aad-7dee-48c9-bbb0-67164a5931c5" Apr 17 17:24:45.193342 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:45.193242 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-tkbg5" podUID="a32594c3-cfcc-4c83-88d7-263f8c57a927" Apr 17 17:24:46.516291 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:46.516214 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-84.ec2.internal" event="NodeReady" Apr 17 17:24:46.516659 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:46.516327 2573 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 17 17:24:46.549482 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:46.549453 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-55c4b85f64-487p4"] Apr 17 17:24:46.587503 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:46.587478 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-77nfq"] Apr 17 17:24:46.587668 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:46.587647 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-55c4b85f64-487p4" Apr 17 17:24:46.590306 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:46.590267 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 17 17:24:46.590438 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:46.590419 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-vv5sf\"" Apr 17 17:24:46.590504 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:46.590273 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 17 17:24:46.590564 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:46.590546 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 17 17:24:46.597515 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:46.597494 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 17 17:24:46.611551 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:46.611523 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-l4qxr"] Apr 17 17:24:46.611706 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:46.611690 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-77nfq" Apr 17 17:24:46.613867 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:46.613839 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 17 17:24:46.613969 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:46.613912 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 17 17:24:46.614027 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:46.613913 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-cl7ph\"" Apr 17 17:24:46.627289 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:46.627271 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-55c4b85f64-487p4"] Apr 17 17:24:46.627374 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:46.627298 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-77nfq"] Apr 17 17:24:46.627374 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:46.627309 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-l4qxr"] Apr 17 17:24:46.627498 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:46.627393 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-l4qxr" Apr 17 17:24:46.629477 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:46.629458 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 17 17:24:46.629565 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:46.629554 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 17 17:24:46.629623 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:46.629480 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-rb846\"" Apr 17 17:24:46.629671 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:46.629504 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 17 17:24:46.735638 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:46.735599 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8t4l\" (UniqueName: \"kubernetes.io/projected/967a37c4-4cd4-49ce-9611-47bd8e1bf9dd-kube-api-access-s8t4l\") pod \"ingress-canary-l4qxr\" (UID: \"967a37c4-4cd4-49ce-9611-47bd8e1bf9dd\") " pod="openshift-ingress-canary/ingress-canary-l4qxr" Apr 17 17:24:46.735787 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:46.735646 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/85412bc5-20cb-438b-9637-8f85717abf24-metrics-tls\") pod \"dns-default-77nfq\" (UID: \"85412bc5-20cb-438b-9637-8f85717abf24\") " pod="openshift-dns/dns-default-77nfq" Apr 17 17:24:46.735787 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:46.735717 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" 
(UniqueName: \"kubernetes.io/secret/4b2224cd-10b0-4bf7-bb36-af52cc8d4236-installation-pull-secrets\") pod \"image-registry-55c4b85f64-487p4\" (UID: \"4b2224cd-10b0-4bf7-bb36-af52cc8d4236\") " pod="openshift-image-registry/image-registry-55c4b85f64-487p4" Apr 17 17:24:46.735787 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:46.735752 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87nfr\" (UniqueName: \"kubernetes.io/projected/4b2224cd-10b0-4bf7-bb36-af52cc8d4236-kube-api-access-87nfr\") pod \"image-registry-55c4b85f64-487p4\" (UID: \"4b2224cd-10b0-4bf7-bb36-af52cc8d4236\") " pod="openshift-image-registry/image-registry-55c4b85f64-487p4" Apr 17 17:24:46.735932 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:46.735801 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4b2224cd-10b0-4bf7-bb36-af52cc8d4236-registry-certificates\") pod \"image-registry-55c4b85f64-487p4\" (UID: \"4b2224cd-10b0-4bf7-bb36-af52cc8d4236\") " pod="openshift-image-registry/image-registry-55c4b85f64-487p4" Apr 17 17:24:46.735932 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:46.735827 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4b2224cd-10b0-4bf7-bb36-af52cc8d4236-bound-sa-token\") pod \"image-registry-55c4b85f64-487p4\" (UID: \"4b2224cd-10b0-4bf7-bb36-af52cc8d4236\") " pod="openshift-image-registry/image-registry-55c4b85f64-487p4" Apr 17 17:24:46.735932 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:46.735855 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/967a37c4-4cd4-49ce-9611-47bd8e1bf9dd-cert\") pod \"ingress-canary-l4qxr\" (UID: \"967a37c4-4cd4-49ce-9611-47bd8e1bf9dd\") " 
pod="openshift-ingress-canary/ingress-canary-l4qxr" Apr 17 17:24:46.735932 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:46.735879 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qh7bv\" (UniqueName: \"kubernetes.io/projected/85412bc5-20cb-438b-9637-8f85717abf24-kube-api-access-qh7bv\") pod \"dns-default-77nfq\" (UID: \"85412bc5-20cb-438b-9637-8f85717abf24\") " pod="openshift-dns/dns-default-77nfq" Apr 17 17:24:46.735932 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:46.735916 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/85412bc5-20cb-438b-9637-8f85717abf24-tmp-dir\") pod \"dns-default-77nfq\" (UID: \"85412bc5-20cb-438b-9637-8f85717abf24\") " pod="openshift-dns/dns-default-77nfq" Apr 17 17:24:46.736170 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:46.735942 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4b2224cd-10b0-4bf7-bb36-af52cc8d4236-ca-trust-extracted\") pod \"image-registry-55c4b85f64-487p4\" (UID: \"4b2224cd-10b0-4bf7-bb36-af52cc8d4236\") " pod="openshift-image-registry/image-registry-55c4b85f64-487p4" Apr 17 17:24:46.736170 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:46.735963 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/85412bc5-20cb-438b-9637-8f85717abf24-config-volume\") pod \"dns-default-77nfq\" (UID: \"85412bc5-20cb-438b-9637-8f85717abf24\") " pod="openshift-dns/dns-default-77nfq" Apr 17 17:24:46.736170 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:46.735980 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: 
\"kubernetes.io/secret/4b2224cd-10b0-4bf7-bb36-af52cc8d4236-image-registry-private-configuration\") pod \"image-registry-55c4b85f64-487p4\" (UID: \"4b2224cd-10b0-4bf7-bb36-af52cc8d4236\") " pod="openshift-image-registry/image-registry-55c4b85f64-487p4" Apr 17 17:24:46.736170 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:46.736003 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4b2224cd-10b0-4bf7-bb36-af52cc8d4236-registry-tls\") pod \"image-registry-55c4b85f64-487p4\" (UID: \"4b2224cd-10b0-4bf7-bb36-af52cc8d4236\") " pod="openshift-image-registry/image-registry-55c4b85f64-487p4" Apr 17 17:24:46.736170 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:46.736040 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4b2224cd-10b0-4bf7-bb36-af52cc8d4236-trusted-ca\") pod \"image-registry-55c4b85f64-487p4\" (UID: \"4b2224cd-10b0-4bf7-bb36-af52cc8d4236\") " pod="openshift-image-registry/image-registry-55c4b85f64-487p4" Apr 17 17:24:46.836923 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:46.836846 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/85412bc5-20cb-438b-9637-8f85717abf24-tmp-dir\") pod \"dns-default-77nfq\" (UID: \"85412bc5-20cb-438b-9637-8f85717abf24\") " pod="openshift-dns/dns-default-77nfq" Apr 17 17:24:46.836923 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:46.836891 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4b2224cd-10b0-4bf7-bb36-af52cc8d4236-ca-trust-extracted\") pod \"image-registry-55c4b85f64-487p4\" (UID: \"4b2224cd-10b0-4bf7-bb36-af52cc8d4236\") " pod="openshift-image-registry/image-registry-55c4b85f64-487p4" Apr 17 17:24:46.837142 ip-10-0-139-84 
kubenswrapper[2573]: I0417 17:24:46.837038 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/85412bc5-20cb-438b-9637-8f85717abf24-config-volume\") pod \"dns-default-77nfq\" (UID: \"85412bc5-20cb-438b-9637-8f85717abf24\") " pod="openshift-dns/dns-default-77nfq" Apr 17 17:24:46.837142 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:46.837095 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/4b2224cd-10b0-4bf7-bb36-af52cc8d4236-image-registry-private-configuration\") pod \"image-registry-55c4b85f64-487p4\" (UID: \"4b2224cd-10b0-4bf7-bb36-af52cc8d4236\") " pod="openshift-image-registry/image-registry-55c4b85f64-487p4" Apr 17 17:24:46.837142 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:46.837125 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4b2224cd-10b0-4bf7-bb36-af52cc8d4236-registry-tls\") pod \"image-registry-55c4b85f64-487p4\" (UID: \"4b2224cd-10b0-4bf7-bb36-af52cc8d4236\") " pod="openshift-image-registry/image-registry-55c4b85f64-487p4" Apr 17 17:24:46.837292 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:46.837145 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4b2224cd-10b0-4bf7-bb36-af52cc8d4236-trusted-ca\") pod \"image-registry-55c4b85f64-487p4\" (UID: \"4b2224cd-10b0-4bf7-bb36-af52cc8d4236\") " pod="openshift-image-registry/image-registry-55c4b85f64-487p4" Apr 17 17:24:46.837292 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:46.837186 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s8t4l\" (UniqueName: \"kubernetes.io/projected/967a37c4-4cd4-49ce-9611-47bd8e1bf9dd-kube-api-access-s8t4l\") pod \"ingress-canary-l4qxr\" (UID: 
\"967a37c4-4cd4-49ce-9611-47bd8e1bf9dd\") " pod="openshift-ingress-canary/ingress-canary-l4qxr" Apr 17 17:24:46.837292 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:46.837213 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/85412bc5-20cb-438b-9637-8f85717abf24-metrics-tls\") pod \"dns-default-77nfq\" (UID: \"85412bc5-20cb-438b-9637-8f85717abf24\") " pod="openshift-dns/dns-default-77nfq" Apr 17 17:24:46.837292 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:46.837262 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4b2224cd-10b0-4bf7-bb36-af52cc8d4236-installation-pull-secrets\") pod \"image-registry-55c4b85f64-487p4\" (UID: \"4b2224cd-10b0-4bf7-bb36-af52cc8d4236\") " pod="openshift-image-registry/image-registry-55c4b85f64-487p4" Apr 17 17:24:46.837292 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:46.837260 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/85412bc5-20cb-438b-9637-8f85717abf24-tmp-dir\") pod \"dns-default-77nfq\" (UID: \"85412bc5-20cb-438b-9637-8f85717abf24\") " pod="openshift-dns/dns-default-77nfq" Apr 17 17:24:46.837497 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:46.837262 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 17:24:46.837497 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:46.837325 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-55c4b85f64-487p4: secret "image-registry-tls" not found Apr 17 17:24:46.837497 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:46.837344 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/4b2224cd-10b0-4bf7-bb36-af52cc8d4236-ca-trust-extracted\") pod \"image-registry-55c4b85f64-487p4\" (UID: \"4b2224cd-10b0-4bf7-bb36-af52cc8d4236\") " pod="openshift-image-registry/image-registry-55c4b85f64-487p4" Apr 17 17:24:46.837497 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:46.837394 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4b2224cd-10b0-4bf7-bb36-af52cc8d4236-registry-tls podName:4b2224cd-10b0-4bf7-bb36-af52cc8d4236 nodeName:}" failed. No retries permitted until 2026-04-17 17:24:47.337372891 +0000 UTC m=+32.653454740 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4b2224cd-10b0-4bf7-bb36-af52cc8d4236-registry-tls") pod "image-registry-55c4b85f64-487p4" (UID: "4b2224cd-10b0-4bf7-bb36-af52cc8d4236") : secret "image-registry-tls" not found Apr 17 17:24:46.837497 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:46.837407 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 17:24:46.837497 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:46.837451 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/85412bc5-20cb-438b-9637-8f85717abf24-metrics-tls podName:85412bc5-20cb-438b-9637-8f85717abf24 nodeName:}" failed. No retries permitted until 2026-04-17 17:24:47.337440317 +0000 UTC m=+32.653522142 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/85412bc5-20cb-438b-9637-8f85717abf24-metrics-tls") pod "dns-default-77nfq" (UID: "85412bc5-20cb-438b-9637-8f85717abf24") : secret "dns-default-metrics-tls" not found Apr 17 17:24:46.837497 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:46.837475 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-87nfr\" (UniqueName: \"kubernetes.io/projected/4b2224cd-10b0-4bf7-bb36-af52cc8d4236-kube-api-access-87nfr\") pod \"image-registry-55c4b85f64-487p4\" (UID: \"4b2224cd-10b0-4bf7-bb36-af52cc8d4236\") " pod="openshift-image-registry/image-registry-55c4b85f64-487p4" Apr 17 17:24:46.837818 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:46.837508 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4b2224cd-10b0-4bf7-bb36-af52cc8d4236-registry-certificates\") pod \"image-registry-55c4b85f64-487p4\" (UID: \"4b2224cd-10b0-4bf7-bb36-af52cc8d4236\") " pod="openshift-image-registry/image-registry-55c4b85f64-487p4" Apr 17 17:24:46.837818 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:46.837540 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4b2224cd-10b0-4bf7-bb36-af52cc8d4236-bound-sa-token\") pod \"image-registry-55c4b85f64-487p4\" (UID: \"4b2224cd-10b0-4bf7-bb36-af52cc8d4236\") " pod="openshift-image-registry/image-registry-55c4b85f64-487p4" Apr 17 17:24:46.837818 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:46.837553 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/85412bc5-20cb-438b-9637-8f85717abf24-config-volume\") pod \"dns-default-77nfq\" (UID: \"85412bc5-20cb-438b-9637-8f85717abf24\") " pod="openshift-dns/dns-default-77nfq" Apr 17 17:24:46.837818 ip-10-0-139-84 
kubenswrapper[2573]: I0417 17:24:46.837573 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/967a37c4-4cd4-49ce-9611-47bd8e1bf9dd-cert\") pod \"ingress-canary-l4qxr\" (UID: \"967a37c4-4cd4-49ce-9611-47bd8e1bf9dd\") " pod="openshift-ingress-canary/ingress-canary-l4qxr" Apr 17 17:24:46.837818 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:46.837610 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qh7bv\" (UniqueName: \"kubernetes.io/projected/85412bc5-20cb-438b-9637-8f85717abf24-kube-api-access-qh7bv\") pod \"dns-default-77nfq\" (UID: \"85412bc5-20cb-438b-9637-8f85717abf24\") " pod="openshift-dns/dns-default-77nfq" Apr 17 17:24:46.837818 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:46.837633 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 17:24:46.837818 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:46.837665 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/967a37c4-4cd4-49ce-9611-47bd8e1bf9dd-cert podName:967a37c4-4cd4-49ce-9611-47bd8e1bf9dd nodeName:}" failed. No retries permitted until 2026-04-17 17:24:47.337653092 +0000 UTC m=+32.653734926 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/967a37c4-4cd4-49ce-9611-47bd8e1bf9dd-cert") pod "ingress-canary-l4qxr" (UID: "967a37c4-4cd4-49ce-9611-47bd8e1bf9dd") : secret "canary-serving-cert" not found Apr 17 17:24:46.838183 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:46.838119 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4b2224cd-10b0-4bf7-bb36-af52cc8d4236-registry-certificates\") pod \"image-registry-55c4b85f64-487p4\" (UID: \"4b2224cd-10b0-4bf7-bb36-af52cc8d4236\") " pod="openshift-image-registry/image-registry-55c4b85f64-487p4" Apr 17 17:24:46.838183 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:46.838161 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4b2224cd-10b0-4bf7-bb36-af52cc8d4236-trusted-ca\") pod \"image-registry-55c4b85f64-487p4\" (UID: \"4b2224cd-10b0-4bf7-bb36-af52cc8d4236\") " pod="openshift-image-registry/image-registry-55c4b85f64-487p4" Apr 17 17:24:46.842335 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:46.842314 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4b2224cd-10b0-4bf7-bb36-af52cc8d4236-installation-pull-secrets\") pod \"image-registry-55c4b85f64-487p4\" (UID: \"4b2224cd-10b0-4bf7-bb36-af52cc8d4236\") " pod="openshift-image-registry/image-registry-55c4b85f64-487p4" Apr 17 17:24:46.842441 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:46.842319 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/4b2224cd-10b0-4bf7-bb36-af52cc8d4236-image-registry-private-configuration\") pod \"image-registry-55c4b85f64-487p4\" (UID: \"4b2224cd-10b0-4bf7-bb36-af52cc8d4236\") " pod="openshift-image-registry/image-registry-55c4b85f64-487p4" Apr 17 
17:24:46.846198 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:46.846176 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qh7bv\" (UniqueName: \"kubernetes.io/projected/85412bc5-20cb-438b-9637-8f85717abf24-kube-api-access-qh7bv\") pod \"dns-default-77nfq\" (UID: \"85412bc5-20cb-438b-9637-8f85717abf24\") " pod="openshift-dns/dns-default-77nfq" Apr 17 17:24:46.846643 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:46.846618 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8t4l\" (UniqueName: \"kubernetes.io/projected/967a37c4-4cd4-49ce-9611-47bd8e1bf9dd-kube-api-access-s8t4l\") pod \"ingress-canary-l4qxr\" (UID: \"967a37c4-4cd4-49ce-9611-47bd8e1bf9dd\") " pod="openshift-ingress-canary/ingress-canary-l4qxr" Apr 17 17:24:46.846810 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:46.846790 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-87nfr\" (UniqueName: \"kubernetes.io/projected/4b2224cd-10b0-4bf7-bb36-af52cc8d4236-kube-api-access-87nfr\") pod \"image-registry-55c4b85f64-487p4\" (UID: \"4b2224cd-10b0-4bf7-bb36-af52cc8d4236\") " pod="openshift-image-registry/image-registry-55c4b85f64-487p4" Apr 17 17:24:46.847123 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:46.847085 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4b2224cd-10b0-4bf7-bb36-af52cc8d4236-bound-sa-token\") pod \"image-registry-55c4b85f64-487p4\" (UID: \"4b2224cd-10b0-4bf7-bb36-af52cc8d4236\") " pod="openshift-image-registry/image-registry-55c4b85f64-487p4" Apr 17 17:24:47.192434 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:47.192400 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tkbg5" Apr 17 17:24:47.192616 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:47.192400 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xckw7" Apr 17 17:24:47.192616 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:47.192411 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-llf4n" Apr 17 17:24:47.195719 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:47.195510 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 17 17:24:47.195719 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:47.195523 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-m76bp\"" Apr 17 17:24:47.195719 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:47.195529 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-6nwcp\"" Apr 17 17:24:47.195719 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:47.195568 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 17 17:24:47.195719 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:47.195584 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 17 17:24:47.196071 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:47.195858 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 17 17:24:47.341528 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:47.341497 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/967a37c4-4cd4-49ce-9611-47bd8e1bf9dd-cert\") pod \"ingress-canary-l4qxr\" (UID: \"967a37c4-4cd4-49ce-9611-47bd8e1bf9dd\") " pod="openshift-ingress-canary/ingress-canary-l4qxr" Apr 17 17:24:47.341714 
ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:47.341551 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4b2224cd-10b0-4bf7-bb36-af52cc8d4236-registry-tls\") pod \"image-registry-55c4b85f64-487p4\" (UID: \"4b2224cd-10b0-4bf7-bb36-af52cc8d4236\") " pod="openshift-image-registry/image-registry-55c4b85f64-487p4" Apr 17 17:24:47.341714 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:47.341598 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/85412bc5-20cb-438b-9637-8f85717abf24-metrics-tls\") pod \"dns-default-77nfq\" (UID: \"85412bc5-20cb-438b-9637-8f85717abf24\") " pod="openshift-dns/dns-default-77nfq" Apr 17 17:24:47.341714 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:47.341659 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 17:24:47.341714 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:47.341702 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 17:24:47.341914 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:47.341711 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 17:24:47.341914 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:47.341730 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-55c4b85f64-487p4: secret "image-registry-tls" not found Apr 17 17:24:47.341914 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:47.341751 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/85412bc5-20cb-438b-9637-8f85717abf24-metrics-tls podName:85412bc5-20cb-438b-9637-8f85717abf24 nodeName:}" failed. 
No retries permitted until 2026-04-17 17:24:48.341735621 +0000 UTC m=+33.657817441 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/85412bc5-20cb-438b-9637-8f85717abf24-metrics-tls") pod "dns-default-77nfq" (UID: "85412bc5-20cb-438b-9637-8f85717abf24") : secret "dns-default-metrics-tls" not found Apr 17 17:24:47.341914 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:47.341769 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/967a37c4-4cd4-49ce-9611-47bd8e1bf9dd-cert podName:967a37c4-4cd4-49ce-9611-47bd8e1bf9dd nodeName:}" failed. No retries permitted until 2026-04-17 17:24:48.341760222 +0000 UTC m=+33.657842043 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/967a37c4-4cd4-49ce-9611-47bd8e1bf9dd-cert") pod "ingress-canary-l4qxr" (UID: "967a37c4-4cd4-49ce-9611-47bd8e1bf9dd") : secret "canary-serving-cert" not found Apr 17 17:24:47.341914 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:47.341793 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4b2224cd-10b0-4bf7-bb36-af52cc8d4236-registry-tls podName:4b2224cd-10b0-4bf7-bb36-af52cc8d4236 nodeName:}" failed. No retries permitted until 2026-04-17 17:24:48.341775943 +0000 UTC m=+33.657857764 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4b2224cd-10b0-4bf7-bb36-af52cc8d4236-registry-tls") pod "image-registry-55c4b85f64-487p4" (UID: "4b2224cd-10b0-4bf7-bb36-af52cc8d4236") : secret "image-registry-tls" not found Apr 17 17:24:48.349719 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:48.349694 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/967a37c4-4cd4-49ce-9611-47bd8e1bf9dd-cert\") pod \"ingress-canary-l4qxr\" (UID: \"967a37c4-4cd4-49ce-9611-47bd8e1bf9dd\") " pod="openshift-ingress-canary/ingress-canary-l4qxr" Apr 17 17:24:48.350140 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:48.349746 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4b2224cd-10b0-4bf7-bb36-af52cc8d4236-registry-tls\") pod \"image-registry-55c4b85f64-487p4\" (UID: \"4b2224cd-10b0-4bf7-bb36-af52cc8d4236\") " pod="openshift-image-registry/image-registry-55c4b85f64-487p4" Apr 17 17:24:48.350140 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:48.349780 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/85412bc5-20cb-438b-9637-8f85717abf24-metrics-tls\") pod \"dns-default-77nfq\" (UID: \"85412bc5-20cb-438b-9637-8f85717abf24\") " pod="openshift-dns/dns-default-77nfq" Apr 17 17:24:48.350140 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:48.349815 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 17:24:48.350140 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:48.349871 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 17:24:48.350140 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:48.349879 2573 projected.go:264] Couldn't get secret 
openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 17:24:48.350140 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:48.349892 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-55c4b85f64-487p4: secret "image-registry-tls" not found Apr 17 17:24:48.350140 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:48.349895 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/967a37c4-4cd4-49ce-9611-47bd8e1bf9dd-cert podName:967a37c4-4cd4-49ce-9611-47bd8e1bf9dd nodeName:}" failed. No retries permitted until 2026-04-17 17:24:50.349866109 +0000 UTC m=+35.665947929 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/967a37c4-4cd4-49ce-9611-47bd8e1bf9dd-cert") pod "ingress-canary-l4qxr" (UID: "967a37c4-4cd4-49ce-9611-47bd8e1bf9dd") : secret "canary-serving-cert" not found Apr 17 17:24:48.350140 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:48.349915 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/85412bc5-20cb-438b-9637-8f85717abf24-metrics-tls podName:85412bc5-20cb-438b-9637-8f85717abf24 nodeName:}" failed. No retries permitted until 2026-04-17 17:24:50.349905762 +0000 UTC m=+35.665987585 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/85412bc5-20cb-438b-9637-8f85717abf24-metrics-tls") pod "dns-default-77nfq" (UID: "85412bc5-20cb-438b-9637-8f85717abf24") : secret "dns-default-metrics-tls" not found Apr 17 17:24:48.350140 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:48.349929 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4b2224cd-10b0-4bf7-bb36-af52cc8d4236-registry-tls podName:4b2224cd-10b0-4bf7-bb36-af52cc8d4236 nodeName:}" failed. 
No retries permitted until 2026-04-17 17:24:50.349922207 +0000 UTC m=+35.666004027 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4b2224cd-10b0-4bf7-bb36-af52cc8d4236-registry-tls") pod "image-registry-55c4b85f64-487p4" (UID: "4b2224cd-10b0-4bf7-bb36-af52cc8d4236") : secret "image-registry-tls" not found Apr 17 17:24:48.954139 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:48.954109 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a06ed33-b68b-4c09-88ed-8a7af0e52ef4-metrics-certs\") pod \"network-metrics-daemon-llf4n\" (UID: \"8a06ed33-b68b-4c09-88ed-8a7af0e52ef4\") " pod="openshift-multus/network-metrics-daemon-llf4n" Apr 17 17:24:48.954322 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:48.954253 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 17 17:24:48.954322 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:48.954316 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a06ed33-b68b-4c09-88ed-8a7af0e52ef4-metrics-certs podName:8a06ed33-b68b-4c09-88ed-8a7af0e52ef4 nodeName:}" failed. No retries permitted until 2026-04-17 17:25:20.954301133 +0000 UTC m=+66.270382952 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8a06ed33-b68b-4c09-88ed-8a7af0e52ef4-metrics-certs") pod "network-metrics-daemon-llf4n" (UID: "8a06ed33-b68b-4c09-88ed-8a7af0e52ef4") : secret "metrics-daemon-secret" not found Apr 17 17:24:49.054426 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:49.054401 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f9hb6\" (UniqueName: \"kubernetes.io/projected/58b27aad-7dee-48c9-bbb0-67164a5931c5-kube-api-access-f9hb6\") pod \"network-check-target-xckw7\" (UID: \"58b27aad-7dee-48c9-bbb0-67164a5931c5\") " pod="openshift-network-diagnostics/network-check-target-xckw7" Apr 17 17:24:49.056848 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:49.056831 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9hb6\" (UniqueName: \"kubernetes.io/projected/58b27aad-7dee-48c9-bbb0-67164a5931c5-kube-api-access-f9hb6\") pod \"network-check-target-xckw7\" (UID: \"58b27aad-7dee-48c9-bbb0-67164a5931c5\") " pod="openshift-network-diagnostics/network-check-target-xckw7" Apr 17 17:24:49.310740 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:49.310686 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xckw7" Apr 17 17:24:49.373729 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:49.373572 2573 generic.go:358] "Generic (PLEG): container finished" podID="8c1b9fc8-4b21-4201-8f92-dce101b1890a" containerID="66e278277ca6f7a2888141c735b98f0a2f3e5866f62bbbfe4ce42f3866fee2f3" exitCode=0 Apr 17 17:24:49.373729 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:49.373646 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lsns4" event={"ID":"8c1b9fc8-4b21-4201-8f92-dce101b1890a","Type":"ContainerDied","Data":"66e278277ca6f7a2888141c735b98f0a2f3e5866f62bbbfe4ce42f3866fee2f3"} Apr 17 17:24:49.477395 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:49.477363 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-xckw7"] Apr 17 17:24:49.483526 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:49.483501 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58b27aad_7dee_48c9_bbb0_67164a5931c5.slice/crio-3e3538d1850baf5f30d4c16e65a6b08e2712e8117fc17d9ddd3593b967ee5970 WatchSource:0}: Error finding container 3e3538d1850baf5f30d4c16e65a6b08e2712e8117fc17d9ddd3593b967ee5970: Status 404 returned error can't find the container with id 3e3538d1850baf5f30d4c16e65a6b08e2712e8117fc17d9ddd3593b967ee5970 Apr 17 17:24:50.363998 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:50.363959 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/967a37c4-4cd4-49ce-9611-47bd8e1bf9dd-cert\") pod \"ingress-canary-l4qxr\" (UID: \"967a37c4-4cd4-49ce-9611-47bd8e1bf9dd\") " pod="openshift-ingress-canary/ingress-canary-l4qxr" Apr 17 17:24:50.364241 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:50.364106 2573 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4b2224cd-10b0-4bf7-bb36-af52cc8d4236-registry-tls\") pod \"image-registry-55c4b85f64-487p4\" (UID: \"4b2224cd-10b0-4bf7-bb36-af52cc8d4236\") " pod="openshift-image-registry/image-registry-55c4b85f64-487p4" Apr 17 17:24:50.364241 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:50.364167 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/85412bc5-20cb-438b-9637-8f85717abf24-metrics-tls\") pod \"dns-default-77nfq\" (UID: \"85412bc5-20cb-438b-9637-8f85717abf24\") " pod="openshift-dns/dns-default-77nfq" Apr 17 17:24:50.364344 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:50.364314 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 17:24:50.364395 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:50.364377 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/85412bc5-20cb-438b-9637-8f85717abf24-metrics-tls podName:85412bc5-20cb-438b-9637-8f85717abf24 nodeName:}" failed. No retries permitted until 2026-04-17 17:24:54.364357335 +0000 UTC m=+39.680439174 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/85412bc5-20cb-438b-9637-8f85717abf24-metrics-tls") pod "dns-default-77nfq" (UID: "85412bc5-20cb-438b-9637-8f85717abf24") : secret "dns-default-metrics-tls" not found Apr 17 17:24:50.364679 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:50.364655 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 17:24:50.364849 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:50.364661 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 17:24:50.364849 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:50.364747 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-55c4b85f64-487p4: secret "image-registry-tls" not found Apr 17 17:24:50.364849 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:50.364734 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/967a37c4-4cd4-49ce-9611-47bd8e1bf9dd-cert podName:967a37c4-4cd4-49ce-9611-47bd8e1bf9dd nodeName:}" failed. No retries permitted until 2026-04-17 17:24:54.364718152 +0000 UTC m=+39.680799992 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/967a37c4-4cd4-49ce-9611-47bd8e1bf9dd-cert") pod "ingress-canary-l4qxr" (UID: "967a37c4-4cd4-49ce-9611-47bd8e1bf9dd") : secret "canary-serving-cert" not found Apr 17 17:24:50.364849 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:50.364790 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4b2224cd-10b0-4bf7-bb36-af52cc8d4236-registry-tls podName:4b2224cd-10b0-4bf7-bb36-af52cc8d4236 nodeName:}" failed. No retries permitted until 2026-04-17 17:24:54.364781656 +0000 UTC m=+39.680863476 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4b2224cd-10b0-4bf7-bb36-af52cc8d4236-registry-tls") pod "image-registry-55c4b85f64-487p4" (UID: "4b2224cd-10b0-4bf7-bb36-af52cc8d4236") : secret "image-registry-tls" not found Apr 17 17:24:50.376318 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:50.376296 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xckw7" event={"ID":"58b27aad-7dee-48c9-bbb0-67164a5931c5","Type":"ContainerStarted","Data":"3e3538d1850baf5f30d4c16e65a6b08e2712e8117fc17d9ddd3593b967ee5970"} Apr 17 17:24:50.379112 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:50.379087 2573 generic.go:358] "Generic (PLEG): container finished" podID="8c1b9fc8-4b21-4201-8f92-dce101b1890a" containerID="d0b1b8e6ad868fee31ffbac6771489b674c4a6e150037ab5cfab2a60305307c2" exitCode=0 Apr 17 17:24:50.379241 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:50.379124 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lsns4" event={"ID":"8c1b9fc8-4b21-4201-8f92-dce101b1890a","Type":"ContainerDied","Data":"d0b1b8e6ad868fee31ffbac6771489b674c4a6e150037ab5cfab2a60305307c2"} Apr 17 17:24:51.385790 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:51.385750 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lsns4" event={"ID":"8c1b9fc8-4b21-4201-8f92-dce101b1890a","Type":"ContainerStarted","Data":"6cea1497df92f6b8c26df88b7aa4a8d56aec895cb066e7150f4254dd367afbf5"} Apr 17 17:24:51.411189 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:51.411135 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-lsns4" podStartSLOduration=5.910169348 podStartE2EDuration="36.411117656s" podCreationTimestamp="2026-04-17 17:24:15 +0000 UTC" firstStartedPulling="2026-04-17 17:24:17.755374575 +0000 UTC m=+3.071456396" 
lastFinishedPulling="2026-04-17 17:24:48.256322872 +0000 UTC m=+33.572404704" observedRunningTime="2026-04-17 17:24:51.409967517 +0000 UTC m=+36.726049361" watchObservedRunningTime="2026-04-17 17:24:51.411117656 +0000 UTC m=+36.727199500" Apr 17 17:24:51.976193 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:51.976153 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a32594c3-cfcc-4c83-88d7-263f8c57a927-original-pull-secret\") pod \"global-pull-secret-syncer-tkbg5\" (UID: \"a32594c3-cfcc-4c83-88d7-263f8c57a927\") " pod="kube-system/global-pull-secret-syncer-tkbg5" Apr 17 17:24:51.980843 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:51.980814 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a32594c3-cfcc-4c83-88d7-263f8c57a927-original-pull-secret\") pod \"global-pull-secret-syncer-tkbg5\" (UID: \"a32594c3-cfcc-4c83-88d7-263f8c57a927\") " pod="kube-system/global-pull-secret-syncer-tkbg5" Apr 17 17:24:52.003743 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:52.003719 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-tkbg5" Apr 17 17:24:52.387744 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:52.387718 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-tkbg5"] Apr 17 17:24:52.399417 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:24:52.399385 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda32594c3_cfcc_4c83_88d7_263f8c57a927.slice/crio-f1ad7ea1a0b21e6aeb1df462ce247b1787d2a052985705573739a1a1df314112 WatchSource:0}: Error finding container f1ad7ea1a0b21e6aeb1df462ce247b1787d2a052985705573739a1a1df314112: Status 404 returned error can't find the container with id f1ad7ea1a0b21e6aeb1df462ce247b1787d2a052985705573739a1a1df314112 Apr 17 17:24:53.391175 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:53.391136 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xckw7" event={"ID":"58b27aad-7dee-48c9-bbb0-67164a5931c5","Type":"ContainerStarted","Data":"2b4ae47cf9ac9d747ca21f9d48463fb6abc3b92d5c699953d8203cce94be5519"} Apr 17 17:24:53.391605 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:53.391241 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-xckw7" Apr 17 17:24:53.392391 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:53.392366 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-tkbg5" event={"ID":"a32594c3-cfcc-4c83-88d7-263f8c57a927","Type":"ContainerStarted","Data":"f1ad7ea1a0b21e6aeb1df462ce247b1787d2a052985705573739a1a1df314112"} Apr 17 17:24:53.411445 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:53.411398 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-xckw7" podStartSLOduration=35.340747336 
podStartE2EDuration="38.41138425s" podCreationTimestamp="2026-04-17 17:24:15 +0000 UTC" firstStartedPulling="2026-04-17 17:24:49.485367698 +0000 UTC m=+34.801449521" lastFinishedPulling="2026-04-17 17:24:52.556004611 +0000 UTC m=+37.872086435" observedRunningTime="2026-04-17 17:24:53.410395675 +0000 UTC m=+38.726477516" watchObservedRunningTime="2026-04-17 17:24:53.41138425 +0000 UTC m=+38.727466089" Apr 17 17:24:54.392236 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:54.392206 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/967a37c4-4cd4-49ce-9611-47bd8e1bf9dd-cert\") pod \"ingress-canary-l4qxr\" (UID: \"967a37c4-4cd4-49ce-9611-47bd8e1bf9dd\") " pod="openshift-ingress-canary/ingress-canary-l4qxr" Apr 17 17:24:54.392693 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:54.392266 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4b2224cd-10b0-4bf7-bb36-af52cc8d4236-registry-tls\") pod \"image-registry-55c4b85f64-487p4\" (UID: \"4b2224cd-10b0-4bf7-bb36-af52cc8d4236\") " pod="openshift-image-registry/image-registry-55c4b85f64-487p4" Apr 17 17:24:54.392693 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:54.392305 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/85412bc5-20cb-438b-9637-8f85717abf24-metrics-tls\") pod \"dns-default-77nfq\" (UID: \"85412bc5-20cb-438b-9637-8f85717abf24\") " pod="openshift-dns/dns-default-77nfq" Apr 17 17:24:54.392693 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:54.392355 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 17:24:54.392693 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:54.392419 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not 
found Apr 17 17:24:54.392693 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:54.392445 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-55c4b85f64-487p4: secret "image-registry-tls" not found Apr 17 17:24:54.392693 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:54.392419 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 17:24:54.392693 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:54.392425 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/967a37c4-4cd4-49ce-9611-47bd8e1bf9dd-cert podName:967a37c4-4cd4-49ce-9611-47bd8e1bf9dd nodeName:}" failed. No retries permitted until 2026-04-17 17:25:02.392406139 +0000 UTC m=+47.708487974 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/967a37c4-4cd4-49ce-9611-47bd8e1bf9dd-cert") pod "ingress-canary-l4qxr" (UID: "967a37c4-4cd4-49ce-9611-47bd8e1bf9dd") : secret "canary-serving-cert" not found Apr 17 17:24:54.392693 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:54.392546 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4b2224cd-10b0-4bf7-bb36-af52cc8d4236-registry-tls podName:4b2224cd-10b0-4bf7-bb36-af52cc8d4236 nodeName:}" failed. No retries permitted until 2026-04-17 17:25:02.392519593 +0000 UTC m=+47.708601428 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4b2224cd-10b0-4bf7-bb36-af52cc8d4236-registry-tls") pod "image-registry-55c4b85f64-487p4" (UID: "4b2224cd-10b0-4bf7-bb36-af52cc8d4236") : secret "image-registry-tls" not found Apr 17 17:24:54.392693 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:24:54.392565 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/85412bc5-20cb-438b-9637-8f85717abf24-metrics-tls podName:85412bc5-20cb-438b-9637-8f85717abf24 nodeName:}" failed. No retries permitted until 2026-04-17 17:25:02.392555629 +0000 UTC m=+47.708637452 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/85412bc5-20cb-438b-9637-8f85717abf24-metrics-tls") pod "dns-default-77nfq" (UID: "85412bc5-20cb-438b-9637-8f85717abf24") : secret "dns-default-metrics-tls" not found Apr 17 17:24:57.312039 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:57.311953 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-7l6j9_9b0827a8-3eb5-4863-995e-e708ba3f0756/dns-node-resolver/0.log" Apr 17 17:24:57.401391 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:57.401352 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-tkbg5" event={"ID":"a32594c3-cfcc-4c83-88d7-263f8c57a927","Type":"ContainerStarted","Data":"628f0e1420430d7222e610585ca3639bcf97fdf05f53cc622ed9ab5893c9be12"} Apr 17 17:24:57.417669 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:57.417625 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-tkbg5" podStartSLOduration=32.79962565 podStartE2EDuration="37.417612349s" podCreationTimestamp="2026-04-17 17:24:20 +0000 UTC" firstStartedPulling="2026-04-17 17:24:52.400666639 +0000 UTC m=+37.716748459" lastFinishedPulling="2026-04-17 17:24:57.018653325 +0000 UTC m=+42.334735158" 
observedRunningTime="2026-04-17 17:24:57.417373269 +0000 UTC m=+42.733455114" watchObservedRunningTime="2026-04-17 17:24:57.417612349 +0000 UTC m=+42.733694190" Apr 17 17:24:58.713083 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:24:58.713056 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-whd7s_96e2a5d4-ba15-434e-9514-847bbd7fec29/node-ca/0.log" Apr 17 17:25:02.451407 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:02.451370 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/85412bc5-20cb-438b-9637-8f85717abf24-metrics-tls\") pod \"dns-default-77nfq\" (UID: \"85412bc5-20cb-438b-9637-8f85717abf24\") " pod="openshift-dns/dns-default-77nfq" Apr 17 17:25:02.451957 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:02.451427 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/967a37c4-4cd4-49ce-9611-47bd8e1bf9dd-cert\") pod \"ingress-canary-l4qxr\" (UID: \"967a37c4-4cd4-49ce-9611-47bd8e1bf9dd\") " pod="openshift-ingress-canary/ingress-canary-l4qxr" Apr 17 17:25:02.451957 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:02.451460 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4b2224cd-10b0-4bf7-bb36-af52cc8d4236-registry-tls\") pod \"image-registry-55c4b85f64-487p4\" (UID: \"4b2224cd-10b0-4bf7-bb36-af52cc8d4236\") " pod="openshift-image-registry/image-registry-55c4b85f64-487p4" Apr 17 17:25:02.451957 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:25:02.451531 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 17:25:02.451957 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:25:02.451546 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 
17 17:25:02.451957 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:25:02.451557 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-55c4b85f64-487p4: secret "image-registry-tls" not found Apr 17 17:25:02.451957 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:25:02.451567 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 17:25:02.451957 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:25:02.451603 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4b2224cd-10b0-4bf7-bb36-af52cc8d4236-registry-tls podName:4b2224cd-10b0-4bf7-bb36-af52cc8d4236 nodeName:}" failed. No retries permitted until 2026-04-17 17:25:18.45158963 +0000 UTC m=+63.767671450 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4b2224cd-10b0-4bf7-bb36-af52cc8d4236-registry-tls") pod "image-registry-55c4b85f64-487p4" (UID: "4b2224cd-10b0-4bf7-bb36-af52cc8d4236") : secret "image-registry-tls" not found Apr 17 17:25:02.451957 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:25:02.451615 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/85412bc5-20cb-438b-9637-8f85717abf24-metrics-tls podName:85412bc5-20cb-438b-9637-8f85717abf24 nodeName:}" failed. No retries permitted until 2026-04-17 17:25:18.451609584 +0000 UTC m=+63.767691403 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/85412bc5-20cb-438b-9637-8f85717abf24-metrics-tls") pod "dns-default-77nfq" (UID: "85412bc5-20cb-438b-9637-8f85717abf24") : secret "dns-default-metrics-tls" not found Apr 17 17:25:02.451957 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:25:02.451625 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/967a37c4-4cd4-49ce-9611-47bd8e1bf9dd-cert podName:967a37c4-4cd4-49ce-9611-47bd8e1bf9dd nodeName:}" failed. No retries permitted until 2026-04-17 17:25:18.451619881 +0000 UTC m=+63.767701701 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/967a37c4-4cd4-49ce-9611-47bd8e1bf9dd-cert") pod "ingress-canary-l4qxr" (UID: "967a37c4-4cd4-49ce-9611-47bd8e1bf9dd") : secret "canary-serving-cert" not found Apr 17 17:25:12.367730 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:12.367703 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-z92s2" Apr 17 17:25:18.461438 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:18.461403 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4b2224cd-10b0-4bf7-bb36-af52cc8d4236-registry-tls\") pod \"image-registry-55c4b85f64-487p4\" (UID: \"4b2224cd-10b0-4bf7-bb36-af52cc8d4236\") " pod="openshift-image-registry/image-registry-55c4b85f64-487p4" Apr 17 17:25:18.461900 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:18.461445 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/85412bc5-20cb-438b-9637-8f85717abf24-metrics-tls\") pod \"dns-default-77nfq\" (UID: \"85412bc5-20cb-438b-9637-8f85717abf24\") " pod="openshift-dns/dns-default-77nfq" Apr 17 17:25:18.461900 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:18.461479 2573 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/967a37c4-4cd4-49ce-9611-47bd8e1bf9dd-cert\") pod \"ingress-canary-l4qxr\" (UID: \"967a37c4-4cd4-49ce-9611-47bd8e1bf9dd\") " pod="openshift-ingress-canary/ingress-canary-l4qxr" Apr 17 17:25:18.464318 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:18.464289 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/85412bc5-20cb-438b-9637-8f85717abf24-metrics-tls\") pod \"dns-default-77nfq\" (UID: \"85412bc5-20cb-438b-9637-8f85717abf24\") " pod="openshift-dns/dns-default-77nfq" Apr 17 17:25:18.464429 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:18.464417 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/967a37c4-4cd4-49ce-9611-47bd8e1bf9dd-cert\") pod \"ingress-canary-l4qxr\" (UID: \"967a37c4-4cd4-49ce-9611-47bd8e1bf9dd\") " pod="openshift-ingress-canary/ingress-canary-l4qxr" Apr 17 17:25:18.474780 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:18.474756 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4b2224cd-10b0-4bf7-bb36-af52cc8d4236-registry-tls\") pod \"image-registry-55c4b85f64-487p4\" (UID: \"4b2224cd-10b0-4bf7-bb36-af52cc8d4236\") " pod="openshift-image-registry/image-registry-55c4b85f64-487p4" Apr 17 17:25:18.703386 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:18.703354 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-vv5sf\"" Apr 17 17:25:18.711261 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:18.711243 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-55c4b85f64-487p4" Apr 17 17:25:18.723777 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:18.723759 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-cl7ph\"" Apr 17 17:25:18.732093 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:18.732071 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-77nfq" Apr 17 17:25:18.739442 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:18.739420 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-rb846\"" Apr 17 17:25:18.747097 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:18.747070 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-l4qxr" Apr 17 17:25:18.842485 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:18.842425 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-55c4b85f64-487p4"] Apr 17 17:25:18.845628 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:25:18.845574 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b2224cd_10b0_4bf7_bb36_af52cc8d4236.slice/crio-2482fbb972bc8b886a9c258d4175e5dbd0c87005eaf68aaed0cd72f354fd9585 WatchSource:0}: Error finding container 2482fbb972bc8b886a9c258d4175e5dbd0c87005eaf68aaed0cd72f354fd9585: Status 404 returned error can't find the container with id 2482fbb972bc8b886a9c258d4175e5dbd0c87005eaf68aaed0cd72f354fd9585 Apr 17 17:25:18.868045 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:18.868009 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-77nfq"] Apr 17 17:25:18.881712 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:18.881691 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-ingress-canary/ingress-canary-l4qxr"] Apr 17 17:25:18.891163 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:25:18.891141 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod85412bc5_20cb_438b_9637_8f85717abf24.slice/crio-31c71c0da2a567f41c7fcae19b0e65ebf6f30773361865a6421ee4e452b5ec89 WatchSource:0}: Error finding container 31c71c0da2a567f41c7fcae19b0e65ebf6f30773361865a6421ee4e452b5ec89: Status 404 returned error can't find the container with id 31c71c0da2a567f41c7fcae19b0e65ebf6f30773361865a6421ee4e452b5ec89 Apr 17 17:25:18.891651 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:25:18.891626 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod967a37c4_4cd4_49ce_9611_47bd8e1bf9dd.slice/crio-24620ce127676649bbee48e18b66b32ce3a6bfb883609bf365e5ce633c3ea13e WatchSource:0}: Error finding container 24620ce127676649bbee48e18b66b32ce3a6bfb883609bf365e5ce633c3ea13e: Status 404 returned error can't find the container with id 24620ce127676649bbee48e18b66b32ce3a6bfb883609bf365e5ce633c3ea13e Apr 17 17:25:19.444597 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:19.444553 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-l4qxr" event={"ID":"967a37c4-4cd4-49ce-9611-47bd8e1bf9dd","Type":"ContainerStarted","Data":"24620ce127676649bbee48e18b66b32ce3a6bfb883609bf365e5ce633c3ea13e"} Apr 17 17:25:19.445720 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:19.445693 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-77nfq" event={"ID":"85412bc5-20cb-438b-9637-8f85717abf24","Type":"ContainerStarted","Data":"31c71c0da2a567f41c7fcae19b0e65ebf6f30773361865a6421ee4e452b5ec89"} Apr 17 17:25:19.447404 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:19.447378 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-image-registry/image-registry-55c4b85f64-487p4" event={"ID":"4b2224cd-10b0-4bf7-bb36-af52cc8d4236","Type":"ContainerStarted","Data":"884be2c45927f509a95d4424be9f5442604bda194b3da268d081c9155832728e"} Apr 17 17:25:19.447498 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:19.447414 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-55c4b85f64-487p4" event={"ID":"4b2224cd-10b0-4bf7-bb36-af52cc8d4236","Type":"ContainerStarted","Data":"2482fbb972bc8b886a9c258d4175e5dbd0c87005eaf68aaed0cd72f354fd9585"} Apr 17 17:25:19.447615 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:19.447600 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-55c4b85f64-487p4" Apr 17 17:25:19.469406 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:19.469356 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-55c4b85f64-487p4" podStartSLOduration=54.469339534 podStartE2EDuration="54.469339534s" podCreationTimestamp="2026-04-17 17:24:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:25:19.469294087 +0000 UTC m=+64.785375929" watchObservedRunningTime="2026-04-17 17:25:19.469339534 +0000 UTC m=+64.785421373" Apr 17 17:25:20.683631 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:20.683598 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bd76c8455-ktdxb"] Apr 17 17:25:20.704315 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:20.704290 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bd76c8455-ktdxb" Apr 17 17:25:20.707124 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:20.706950 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 17 17:25:20.707248 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:20.707172 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 17 17:25:20.707248 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:20.707238 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 17 17:25:20.708754 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:20.708047 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 17 17:25:20.708754 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:20.708102 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 17 17:25:20.708754 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:20.708147 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 17 17:25:20.708754 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:20.708504 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 17 17:25:20.711828 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:20.711792 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bd76c8455-ktdxb"] Apr 17 17:25:20.745124 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:20.745099 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-55c4b85f64-487p4"] Apr 17 17:25:20.781279 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:20.781245 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/3cbcac26-04eb-4416-bb17-83348aa7f24e-ca\") pod \"cluster-proxy-proxy-agent-6bd76c8455-ktdxb\" (UID: \"3cbcac26-04eb-4416-bb17-83348aa7f24e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bd76c8455-ktdxb" Apr 17 17:25:20.781279 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:20.781282 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvjf4\" (UniqueName: \"kubernetes.io/projected/3cbcac26-04eb-4416-bb17-83348aa7f24e-kube-api-access-hvjf4\") pod \"cluster-proxy-proxy-agent-6bd76c8455-ktdxb\" (UID: \"3cbcac26-04eb-4416-bb17-83348aa7f24e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bd76c8455-ktdxb" Apr 17 17:25:20.781477 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:20.781315 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/3cbcac26-04eb-4416-bb17-83348aa7f24e-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-6bd76c8455-ktdxb\" (UID: \"3cbcac26-04eb-4416-bb17-83348aa7f24e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bd76c8455-ktdxb" Apr 17 17:25:20.781477 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:20.781392 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: 
\"kubernetes.io/secret/3cbcac26-04eb-4416-bb17-83348aa7f24e-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-6bd76c8455-ktdxb\" (UID: \"3cbcac26-04eb-4416-bb17-83348aa7f24e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bd76c8455-ktdxb" Apr 17 17:25:20.781584 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:20.781475 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/3cbcac26-04eb-4416-bb17-83348aa7f24e-hub\") pod \"cluster-proxy-proxy-agent-6bd76c8455-ktdxb\" (UID: \"3cbcac26-04eb-4416-bb17-83348aa7f24e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bd76c8455-ktdxb" Apr 17 17:25:20.781584 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:20.781506 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/3cbcac26-04eb-4416-bb17-83348aa7f24e-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-6bd76c8455-ktdxb\" (UID: \"3cbcac26-04eb-4416-bb17-83348aa7f24e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bd76c8455-ktdxb" Apr 17 17:25:20.849463 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:20.849434 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-xrs4v"] Apr 17 17:25:20.863167 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:20.863145 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-xrs4v" Apr 17 17:25:20.872579 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:20.872561 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 17 17:25:20.874894 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:20.874877 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 17 17:25:20.877458 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:20.877443 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-42cdb\"" Apr 17 17:25:20.878460 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:20.878446 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 17 17:25:20.881977 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:20.881960 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/3cbcac26-04eb-4416-bb17-83348aa7f24e-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-6bd76c8455-ktdxb\" (UID: \"3cbcac26-04eb-4416-bb17-83348aa7f24e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bd76c8455-ktdxb" Apr 17 17:25:20.882045 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:20.881990 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/3cbcac26-04eb-4416-bb17-83348aa7f24e-hub\") pod \"cluster-proxy-proxy-agent-6bd76c8455-ktdxb\" (UID: \"3cbcac26-04eb-4416-bb17-83348aa7f24e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bd76c8455-ktdxb" Apr 17 17:25:20.882045 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:20.882009 2573 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/3cbcac26-04eb-4416-bb17-83348aa7f24e-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-6bd76c8455-ktdxb\" (UID: \"3cbcac26-04eb-4416-bb17-83348aa7f24e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bd76c8455-ktdxb" Apr 17 17:25:20.882114 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:20.882052 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/3cbcac26-04eb-4416-bb17-83348aa7f24e-ca\") pod \"cluster-proxy-proxy-agent-6bd76c8455-ktdxb\" (UID: \"3cbcac26-04eb-4416-bb17-83348aa7f24e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bd76c8455-ktdxb" Apr 17 17:25:20.882114 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:20.882073 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hvjf4\" (UniqueName: \"kubernetes.io/projected/3cbcac26-04eb-4416-bb17-83348aa7f24e-kube-api-access-hvjf4\") pod \"cluster-proxy-proxy-agent-6bd76c8455-ktdxb\" (UID: \"3cbcac26-04eb-4416-bb17-83348aa7f24e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bd76c8455-ktdxb" Apr 17 17:25:20.882114 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:20.882101 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/3cbcac26-04eb-4416-bb17-83348aa7f24e-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-6bd76c8455-ktdxb\" (UID: \"3cbcac26-04eb-4416-bb17-83348aa7f24e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bd76c8455-ktdxb" Apr 17 17:25:20.882747 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:20.882725 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/3cbcac26-04eb-4416-bb17-83348aa7f24e-ocpservice-ca\") pod 
\"cluster-proxy-proxy-agent-6bd76c8455-ktdxb\" (UID: \"3cbcac26-04eb-4416-bb17-83348aa7f24e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bd76c8455-ktdxb" Apr 17 17:25:20.883109 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:20.883087 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 17 17:25:20.884464 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:20.884434 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/3cbcac26-04eb-4416-bb17-83348aa7f24e-hub\") pod \"cluster-proxy-proxy-agent-6bd76c8455-ktdxb\" (UID: \"3cbcac26-04eb-4416-bb17-83348aa7f24e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bd76c8455-ktdxb" Apr 17 17:25:20.884464 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:20.884447 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/3cbcac26-04eb-4416-bb17-83348aa7f24e-ca\") pod \"cluster-proxy-proxy-agent-6bd76c8455-ktdxb\" (UID: \"3cbcac26-04eb-4416-bb17-83348aa7f24e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bd76c8455-ktdxb" Apr 17 17:25:20.884615 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:20.884446 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/3cbcac26-04eb-4416-bb17-83348aa7f24e-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-6bd76c8455-ktdxb\" (UID: \"3cbcac26-04eb-4416-bb17-83348aa7f24e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bd76c8455-ktdxb" Apr 17 17:25:20.884615 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:20.884567 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/3cbcac26-04eb-4416-bb17-83348aa7f24e-hub-kubeconfig\") pod 
\"cluster-proxy-proxy-agent-6bd76c8455-ktdxb\" (UID: \"3cbcac26-04eb-4416-bb17-83348aa7f24e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bd76c8455-ktdxb" Apr 17 17:25:20.898167 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:20.898146 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-xrs4v"] Apr 17 17:25:20.920565 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:20.920545 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvjf4\" (UniqueName: \"kubernetes.io/projected/3cbcac26-04eb-4416-bb17-83348aa7f24e-kube-api-access-hvjf4\") pod \"cluster-proxy-proxy-agent-6bd76c8455-ktdxb\" (UID: \"3cbcac26-04eb-4416-bb17-83348aa7f24e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bd76c8455-ktdxb" Apr 17 17:25:20.983439 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:20.983380 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a06ed33-b68b-4c09-88ed-8a7af0e52ef4-metrics-certs\") pod \"network-metrics-daemon-llf4n\" (UID: \"8a06ed33-b68b-4c09-88ed-8a7af0e52ef4\") " pod="openshift-multus/network-metrics-daemon-llf4n" Apr 17 17:25:20.983439 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:20.983427 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/20ec495a-dce4-465c-b00c-8812ae039f7b-data-volume\") pod \"insights-runtime-extractor-xrs4v\" (UID: \"20ec495a-dce4-465c-b00c-8812ae039f7b\") " pod="openshift-insights/insights-runtime-extractor-xrs4v" Apr 17 17:25:20.983602 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:20.983472 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/20ec495a-dce4-465c-b00c-8812ae039f7b-kube-rbac-proxy-cm\") pod 
\"insights-runtime-extractor-xrs4v\" (UID: \"20ec495a-dce4-465c-b00c-8812ae039f7b\") " pod="openshift-insights/insights-runtime-extractor-xrs4v" Apr 17 17:25:20.983602 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:20.983502 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/20ec495a-dce4-465c-b00c-8812ae039f7b-crio-socket\") pod \"insights-runtime-extractor-xrs4v\" (UID: \"20ec495a-dce4-465c-b00c-8812ae039f7b\") " pod="openshift-insights/insights-runtime-extractor-xrs4v" Apr 17 17:25:20.983602 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:20.983562 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/20ec495a-dce4-465c-b00c-8812ae039f7b-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-xrs4v\" (UID: \"20ec495a-dce4-465c-b00c-8812ae039f7b\") " pod="openshift-insights/insights-runtime-extractor-xrs4v" Apr 17 17:25:20.983695 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:20.983604 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p68qn\" (UniqueName: \"kubernetes.io/projected/20ec495a-dce4-465c-b00c-8812ae039f7b-kube-api-access-p68qn\") pod \"insights-runtime-extractor-xrs4v\" (UID: \"20ec495a-dce4-465c-b00c-8812ae039f7b\") " pod="openshift-insights/insights-runtime-extractor-xrs4v" Apr 17 17:25:20.985620 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:20.985603 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a06ed33-b68b-4c09-88ed-8a7af0e52ef4-metrics-certs\") pod \"network-metrics-daemon-llf4n\" (UID: \"8a06ed33-b68b-4c09-88ed-8a7af0e52ef4\") " pod="openshift-multus/network-metrics-daemon-llf4n" Apr 17 17:25:21.034574 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:21.034548 2573 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bd76c8455-ktdxb" Apr 17 17:25:21.084214 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:21.084187 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/20ec495a-dce4-465c-b00c-8812ae039f7b-data-volume\") pod \"insights-runtime-extractor-xrs4v\" (UID: \"20ec495a-dce4-465c-b00c-8812ae039f7b\") " pod="openshift-insights/insights-runtime-extractor-xrs4v" Apr 17 17:25:21.084360 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:21.084227 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/20ec495a-dce4-465c-b00c-8812ae039f7b-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-xrs4v\" (UID: \"20ec495a-dce4-465c-b00c-8812ae039f7b\") " pod="openshift-insights/insights-runtime-extractor-xrs4v" Apr 17 17:25:21.084360 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:21.084251 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/20ec495a-dce4-465c-b00c-8812ae039f7b-crio-socket\") pod \"insights-runtime-extractor-xrs4v\" (UID: \"20ec495a-dce4-465c-b00c-8812ae039f7b\") " pod="openshift-insights/insights-runtime-extractor-xrs4v" Apr 17 17:25:21.084360 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:21.084269 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/20ec495a-dce4-465c-b00c-8812ae039f7b-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-xrs4v\" (UID: \"20ec495a-dce4-465c-b00c-8812ae039f7b\") " pod="openshift-insights/insights-runtime-extractor-xrs4v" Apr 17 17:25:21.084360 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:21.084285 2573 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-p68qn\" (UniqueName: \"kubernetes.io/projected/20ec495a-dce4-465c-b00c-8812ae039f7b-kube-api-access-p68qn\") pod \"insights-runtime-extractor-xrs4v\" (UID: \"20ec495a-dce4-465c-b00c-8812ae039f7b\") " pod="openshift-insights/insights-runtime-extractor-xrs4v" Apr 17 17:25:21.084360 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:21.084315 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/20ec495a-dce4-465c-b00c-8812ae039f7b-crio-socket\") pod \"insights-runtime-extractor-xrs4v\" (UID: \"20ec495a-dce4-465c-b00c-8812ae039f7b\") " pod="openshift-insights/insights-runtime-extractor-xrs4v" Apr 17 17:25:21.084632 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:21.084617 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/20ec495a-dce4-465c-b00c-8812ae039f7b-data-volume\") pod \"insights-runtime-extractor-xrs4v\" (UID: \"20ec495a-dce4-465c-b00c-8812ae039f7b\") " pod="openshift-insights/insights-runtime-extractor-xrs4v" Apr 17 17:25:21.084793 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:21.084776 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/20ec495a-dce4-465c-b00c-8812ae039f7b-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-xrs4v\" (UID: \"20ec495a-dce4-465c-b00c-8812ae039f7b\") " pod="openshift-insights/insights-runtime-extractor-xrs4v" Apr 17 17:25:21.086581 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:21.086562 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/20ec495a-dce4-465c-b00c-8812ae039f7b-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-xrs4v\" (UID: \"20ec495a-dce4-465c-b00c-8812ae039f7b\") " 
pod="openshift-insights/insights-runtime-extractor-xrs4v" Apr 17 17:25:21.103095 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:21.103069 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p68qn\" (UniqueName: \"kubernetes.io/projected/20ec495a-dce4-465c-b00c-8812ae039f7b-kube-api-access-p68qn\") pod \"insights-runtime-extractor-xrs4v\" (UID: \"20ec495a-dce4-465c-b00c-8812ae039f7b\") " pod="openshift-insights/insights-runtime-extractor-xrs4v" Apr 17 17:25:21.119119 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:21.119093 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-6nwcp\"" Apr 17 17:25:21.126489 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:21.126471 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-llf4n" Apr 17 17:25:21.171091 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:21.171064 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-xrs4v" Apr 17 17:25:21.359742 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:21.359712 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-xrs4v"] Apr 17 17:25:21.380494 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:21.380457 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-llf4n"] Apr 17 17:25:21.383819 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:25:21.383795 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a06ed33_b68b_4c09_88ed_8a7af0e52ef4.slice/crio-4032ca5a2524d7474571503c3cd8a4682742e74b598c598d5bef65f6a50f29cc WatchSource:0}: Error finding container 4032ca5a2524d7474571503c3cd8a4682742e74b598c598d5bef65f6a50f29cc: Status 404 returned error can't find the container with id 4032ca5a2524d7474571503c3cd8a4682742e74b598c598d5bef65f6a50f29cc Apr 17 17:25:21.400387 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:21.400179 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bd76c8455-ktdxb"] Apr 17 17:25:21.406760 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:25:21.406732 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3cbcac26_04eb_4416_bb17_83348aa7f24e.slice/crio-6c7dc12710868683d1851c667ca246ecf59bc680249755a140fa85f935774fd5 WatchSource:0}: Error finding container 6c7dc12710868683d1851c667ca246ecf59bc680249755a140fa85f935774fd5: Status 404 returned error can't find the container with id 6c7dc12710868683d1851c667ca246ecf59bc680249755a140fa85f935774fd5 Apr 17 17:25:21.459634 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:21.459594 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bd76c8455-ktdxb" event={"ID":"3cbcac26-04eb-4416-bb17-83348aa7f24e","Type":"ContainerStarted","Data":"6c7dc12710868683d1851c667ca246ecf59bc680249755a140fa85f935774fd5"}
Apr 17 17:25:21.460990 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:21.460945 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-77nfq" event={"ID":"85412bc5-20cb-438b-9637-8f85717abf24","Type":"ContainerStarted","Data":"d07d3bf6b112e807fb2dc881ccdfaaec29fdbf462f9d5eda2eca65bd10fd4fe7"}
Apr 17 17:25:21.462156 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:21.462130 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-llf4n" event={"ID":"8a06ed33-b68b-4c09-88ed-8a7af0e52ef4","Type":"ContainerStarted","Data":"4032ca5a2524d7474571503c3cd8a4682742e74b598c598d5bef65f6a50f29cc"}
Apr 17 17:25:21.466665 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:21.466640 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-xrs4v" event={"ID":"20ec495a-dce4-465c-b00c-8812ae039f7b","Type":"ContainerStarted","Data":"edd65a2f3163037cd29c92e1318d6c4cb469394b61455787ed17b68ca2f5201c"}
Apr 17 17:25:21.466665 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:21.466665 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-xrs4v" event={"ID":"20ec495a-dce4-465c-b00c-8812ae039f7b","Type":"ContainerStarted","Data":"733834f81af49eed54484d12028fd312c4acab743699c422dcf3b3ca8bca22af"}
Apr 17 17:25:21.468173 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:21.468149 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-l4qxr" event={"ID":"967a37c4-4cd4-49ce-9611-47bd8e1bf9dd","Type":"ContainerStarted","Data":"9668e3af35a94c0ec55392f471e1b09aac43a2530388556b494922cf2dc63908"}
Apr 17 17:25:21.487838 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:21.487153 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-l4qxr" podStartSLOduration=33.178719401 podStartE2EDuration="35.487139662s" podCreationTimestamp="2026-04-17 17:24:46 +0000 UTC" firstStartedPulling="2026-04-17 17:25:18.894454234 +0000 UTC m=+64.210536057" lastFinishedPulling="2026-04-17 17:25:21.202874487 +0000 UTC m=+66.518956318" observedRunningTime="2026-04-17 17:25:21.485068907 +0000 UTC m=+66.801150750" watchObservedRunningTime="2026-04-17 17:25:21.487139662 +0000 UTC m=+66.803221504"
Apr 17 17:25:22.474969 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:22.473184 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-77nfq" event={"ID":"85412bc5-20cb-438b-9637-8f85717abf24","Type":"ContainerStarted","Data":"54645b1903f1e02abda3f1c3841c7debb8074fffad6f1ad11514a8045733c476"}
Apr 17 17:25:22.474969 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:22.473871 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-77nfq"
Apr 17 17:25:22.496388 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:22.495440 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-77nfq" podStartSLOduration=34.187063798 podStartE2EDuration="36.495427561s" podCreationTimestamp="2026-04-17 17:24:46 +0000 UTC" firstStartedPulling="2026-04-17 17:25:18.894509691 +0000 UTC m=+64.210591511" lastFinishedPulling="2026-04-17 17:25:21.202873454 +0000 UTC m=+66.518955274" observedRunningTime="2026-04-17 17:25:22.494932395 +0000 UTC m=+67.811014250" watchObservedRunningTime="2026-04-17 17:25:22.495427561 +0000 UTC m=+67.811509407"
Apr 17 17:25:23.480824 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:23.480784 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-xrs4v" event={"ID":"20ec495a-dce4-465c-b00c-8812ae039f7b","Type":"ContainerStarted","Data":"ade1967cde04a8249de7dfd067873c1e5860468afab19d06bd5bd5e08295b8a0"}
Apr 17 17:25:23.482564 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:23.482525 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-llf4n" event={"ID":"8a06ed33-b68b-4c09-88ed-8a7af0e52ef4","Type":"ContainerStarted","Data":"29918d1c6789b12771528dcffb6519c16736765c88d7f4c5bc82d87e0afafd6f"}
Apr 17 17:25:23.482713 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:23.482571 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-llf4n" event={"ID":"8a06ed33-b68b-4c09-88ed-8a7af0e52ef4","Type":"ContainerStarted","Data":"48fc56a992188febefab353912f0c265c5cb10b665f9b1c35f42cc7070504b08"}
Apr 17 17:25:23.502587 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:23.502539 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-llf4n" podStartSLOduration=67.40321682 podStartE2EDuration="1m8.502523332s" podCreationTimestamp="2026-04-17 17:24:15 +0000 UTC" firstStartedPulling="2026-04-17 17:25:21.386150566 +0000 UTC m=+66.702232385" lastFinishedPulling="2026-04-17 17:25:22.485457074 +0000 UTC m=+67.801538897" observedRunningTime="2026-04-17 17:25:23.501480358 +0000 UTC m=+68.817562213" watchObservedRunningTime="2026-04-17 17:25:23.502523332 +0000 UTC m=+68.818605175"
Apr 17 17:25:24.397668 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:24.397633 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xckw7"
Apr 17 17:25:25.490385 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:25.490342 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bd76c8455-ktdxb" event={"ID":"3cbcac26-04eb-4416-bb17-83348aa7f24e","Type":"ContainerStarted","Data":"8fe28eebefe0fba13e1daee6aae68a36a1fba5b2f6d472ed47e333b796622d49"}
Apr 17 17:25:25.492320 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:25.492297 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-xrs4v" event={"ID":"20ec495a-dce4-465c-b00c-8812ae039f7b","Type":"ContainerStarted","Data":"ea3f1f48c8d727a7ae2da34ca0648cb1334b93d00908ee302412b70c4fbc321c"}
Apr 17 17:25:25.512888 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:25.512841 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-xrs4v" podStartSLOduration=2.1977740040000002 podStartE2EDuration="5.512828311s" podCreationTimestamp="2026-04-17 17:25:20 +0000 UTC" firstStartedPulling="2026-04-17 17:25:21.419950131 +0000 UTC m=+66.736031960" lastFinishedPulling="2026-04-17 17:25:24.735004444 +0000 UTC m=+70.051086267" observedRunningTime="2026-04-17 17:25:25.512628749 +0000 UTC m=+70.828710591" watchObservedRunningTime="2026-04-17 17:25:25.512828311 +0000 UTC m=+70.828910153"
Apr 17 17:25:28.502700 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:28.502663 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bd76c8455-ktdxb" event={"ID":"3cbcac26-04eb-4416-bb17-83348aa7f24e","Type":"ContainerStarted","Data":"bbd2fa397064ba45bbd3e5a626acd7c7f79854c021ba3f0734ce294c0cbec403"}
Apr 17 17:25:28.502700 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:28.502702 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bd76c8455-ktdxb" event={"ID":"3cbcac26-04eb-4416-bb17-83348aa7f24e","Type":"ContainerStarted","Data":"adf9d2c2381755ae0aaad6571833a14b715496005cf08381485c40fcb8f954dc"}
Apr 17 17:25:28.527343 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:28.527298 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bd76c8455-ktdxb" podStartSLOduration=2.189993899 podStartE2EDuration="8.527284248s" podCreationTimestamp="2026-04-17 17:25:20 +0000 UTC" firstStartedPulling="2026-04-17 17:25:21.409001242 +0000 UTC m=+66.725083071" lastFinishedPulling="2026-04-17 17:25:27.746291599 +0000 UTC m=+73.062373420" observedRunningTime="2026-04-17 17:25:28.525731474 +0000 UTC m=+73.841813319" watchObservedRunningTime="2026-04-17 17:25:28.527284248 +0000 UTC m=+73.843366089"
Apr 17 17:25:29.997400 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:29.997362 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-24sl4"]
Apr 17 17:25:30.022500 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:30.022474 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-24sl4"]
Apr 17 17:25:30.022500 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:30.022499 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-lklgp"]
Apr 17 17:25:30.022715 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:30.022627 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-24sl4"
Apr 17 17:25:30.026396 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:30.026379 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 17 17:25:30.027115 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:30.027097 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 17 17:25:30.027192 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:30.027124 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\""
Apr 17 17:25:30.028421 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:30.028401 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\""
Apr 17 17:25:30.028520 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:30.028482 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-qd9ps\""
Apr 17 17:25:30.028663 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:30.028640 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 17 17:25:30.040739 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:30.040719 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-ndc6q"]
Apr 17 17:25:30.040849 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:30.040833 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-lklgp"
Apr 17 17:25:30.043428 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:30.043413 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\""
Apr 17 17:25:30.044537 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:30.044512 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-tkhc7\""
Apr 17 17:25:30.044689 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:30.044560 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\""
Apr 17 17:25:30.044801 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:30.044715 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\""
Apr 17 17:25:30.062823 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:30.062807 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-lklgp"]
Apr 17 17:25:30.062911 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:30.062905 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-ndc6q"
Apr 17 17:25:30.065515 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:30.065499 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 17 17:25:30.065617 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:30.065527 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 17 17:25:30.065853 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:30.065818 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-ts64k\""
Apr 17 17:25:30.066277 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:30.066251 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 17 17:25:30.148383 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:30.148356 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5d29d095-2263-4fd2-98c2-ee342220c960-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-lklgp\" (UID: \"5d29d095-2263-4fd2-98c2-ee342220c960\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-lklgp"
Apr 17 17:25:30.148383 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:30.148384 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bd7121ef-8833-411e-9ab4-3de1db83ef61-sys\") pod \"node-exporter-ndc6q\" (UID: \"bd7121ef-8833-411e-9ab4-3de1db83ef61\") " pod="openshift-monitoring/node-exporter-ndc6q"
Apr 17 17:25:30.148571 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:30.148401 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/bd7121ef-8833-411e-9ab4-3de1db83ef61-node-exporter-textfile\") pod \"node-exporter-ndc6q\" (UID: \"bd7121ef-8833-411e-9ab4-3de1db83ef61\") " pod="openshift-monitoring/node-exporter-ndc6q"
Apr 17 17:25:30.148571 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:30.148437 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5d29d095-2263-4fd2-98c2-ee342220c960-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-lklgp\" (UID: \"5d29d095-2263-4fd2-98c2-ee342220c960\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-lklgp"
Apr 17 17:25:30.148571 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:30.148497 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/bd7121ef-8833-411e-9ab4-3de1db83ef61-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-ndc6q\" (UID: \"bd7121ef-8833-411e-9ab4-3de1db83ef61\") " pod="openshift-monitoring/node-exporter-ndc6q"
Apr 17 17:25:30.148571 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:30.148533 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/5d29d095-2263-4fd2-98c2-ee342220c960-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-lklgp\" (UID: \"5d29d095-2263-4fd2-98c2-ee342220c960\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-lklgp"
Apr 17 17:25:30.148571 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:30.148561 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/bd7121ef-8833-411e-9ab4-3de1db83ef61-node-exporter-accelerators-collector-config\") pod \"node-exporter-ndc6q\" (UID: \"bd7121ef-8833-411e-9ab4-3de1db83ef61\") " pod="openshift-monitoring/node-exporter-ndc6q"
Apr 17 17:25:30.148745 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:30.148587 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e9b9ea6f-26a0-4cca-b8be-5bedbf607826-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-24sl4\" (UID: \"e9b9ea6f-26a0-4cca-b8be-5bedbf607826\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-24sl4"
Apr 17 17:25:30.148745 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:30.148604 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnwdh\" (UniqueName: \"kubernetes.io/projected/e9b9ea6f-26a0-4cca-b8be-5bedbf607826-kube-api-access-nnwdh\") pod \"openshift-state-metrics-9d44df66c-24sl4\" (UID: \"e9b9ea6f-26a0-4cca-b8be-5bedbf607826\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-24sl4"
Apr 17 17:25:30.148745 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:30.148625 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/bd7121ef-8833-411e-9ab4-3de1db83ef61-root\") pod \"node-exporter-ndc6q\" (UID: \"bd7121ef-8833-411e-9ab4-3de1db83ef61\") " pod="openshift-monitoring/node-exporter-ndc6q"
Apr 17 17:25:30.148745 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:30.148652 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgpd2\" (UniqueName: \"kubernetes.io/projected/5d29d095-2263-4fd2-98c2-ee342220c960-kube-api-access-tgpd2\") pod \"kube-state-metrics-69db897b98-lklgp\" (UID: \"5d29d095-2263-4fd2-98c2-ee342220c960\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-lklgp"
Apr 17 17:25:30.148745 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:30.148672 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnrj5\" (UniqueName: \"kubernetes.io/projected/bd7121ef-8833-411e-9ab4-3de1db83ef61-kube-api-access-cnrj5\") pod \"node-exporter-ndc6q\" (UID: \"bd7121ef-8833-411e-9ab4-3de1db83ef61\") " pod="openshift-monitoring/node-exporter-ndc6q"
Apr 17 17:25:30.148745 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:30.148704 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bd7121ef-8833-411e-9ab4-3de1db83ef61-metrics-client-ca\") pod \"node-exporter-ndc6q\" (UID: \"bd7121ef-8833-411e-9ab4-3de1db83ef61\") " pod="openshift-monitoring/node-exporter-ndc6q"
Apr 17 17:25:30.148745 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:30.148732 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/bd7121ef-8833-411e-9ab4-3de1db83ef61-node-exporter-tls\") pod \"node-exporter-ndc6q\" (UID: \"bd7121ef-8833-411e-9ab4-3de1db83ef61\") " pod="openshift-monitoring/node-exporter-ndc6q"
Apr 17 17:25:30.148969 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:30.148751 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/e9b9ea6f-26a0-4cca-b8be-5bedbf607826-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-24sl4\" (UID: \"e9b9ea6f-26a0-4cca-b8be-5bedbf607826\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-24sl4"
Apr 17 17:25:30.148969 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:30.148769 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e9b9ea6f-26a0-4cca-b8be-5bedbf607826-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-24sl4\" (UID: \"e9b9ea6f-26a0-4cca-b8be-5bedbf607826\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-24sl4"
Apr 17 17:25:30.148969 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:30.148782 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/bd7121ef-8833-411e-9ab4-3de1db83ef61-node-exporter-wtmp\") pod \"node-exporter-ndc6q\" (UID: \"bd7121ef-8833-411e-9ab4-3de1db83ef61\") " pod="openshift-monitoring/node-exporter-ndc6q"
Apr 17 17:25:30.148969 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:30.148804 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/5d29d095-2263-4fd2-98c2-ee342220c960-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-lklgp\" (UID: \"5d29d095-2263-4fd2-98c2-ee342220c960\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-lklgp"
Apr 17 17:25:30.148969 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:30.148826 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/5d29d095-2263-4fd2-98c2-ee342220c960-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-lklgp\" (UID: \"5d29d095-2263-4fd2-98c2-ee342220c960\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-lklgp"
Apr 17 17:25:30.249758 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:30.249684 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/e9b9ea6f-26a0-4cca-b8be-5bedbf607826-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-24sl4\" (UID: \"e9b9ea6f-26a0-4cca-b8be-5bedbf607826\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-24sl4"
Apr 17 17:25:30.249758 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:30.249719 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e9b9ea6f-26a0-4cca-b8be-5bedbf607826-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-24sl4\" (UID: \"e9b9ea6f-26a0-4cca-b8be-5bedbf607826\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-24sl4"
Apr 17 17:25:30.249758 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:30.249740 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/bd7121ef-8833-411e-9ab4-3de1db83ef61-node-exporter-wtmp\") pod \"node-exporter-ndc6q\" (UID: \"bd7121ef-8833-411e-9ab4-3de1db83ef61\") " pod="openshift-monitoring/node-exporter-ndc6q"
Apr 17 17:25:30.249999 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:30.249761 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/5d29d095-2263-4fd2-98c2-ee342220c960-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-lklgp\" (UID: \"5d29d095-2263-4fd2-98c2-ee342220c960\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-lklgp"
Apr 17 17:25:30.249999 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:30.249778 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/5d29d095-2263-4fd2-98c2-ee342220c960-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-lklgp\" (UID: \"5d29d095-2263-4fd2-98c2-ee342220c960\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-lklgp"
Apr 17 17:25:30.249999 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:30.249803 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5d29d095-2263-4fd2-98c2-ee342220c960-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-lklgp\" (UID: \"5d29d095-2263-4fd2-98c2-ee342220c960\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-lklgp"
Apr 17 17:25:30.249999 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:30.249828 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bd7121ef-8833-411e-9ab4-3de1db83ef61-sys\") pod \"node-exporter-ndc6q\" (UID: \"bd7121ef-8833-411e-9ab4-3de1db83ef61\") " pod="openshift-monitoring/node-exporter-ndc6q"
Apr 17 17:25:30.249999 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:30.249856 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/bd7121ef-8833-411e-9ab4-3de1db83ef61-node-exporter-textfile\") pod \"node-exporter-ndc6q\" (UID: \"bd7121ef-8833-411e-9ab4-3de1db83ef61\") " pod="openshift-monitoring/node-exporter-ndc6q"
Apr 17 17:25:30.249999 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:30.249884 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5d29d095-2263-4fd2-98c2-ee342220c960-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-lklgp\" (UID: \"5d29d095-2263-4fd2-98c2-ee342220c960\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-lklgp"
Apr 17 17:25:30.249999 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:30.249912 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/bd7121ef-8833-411e-9ab4-3de1db83ef61-node-exporter-wtmp\") pod \"node-exporter-ndc6q\" (UID: \"bd7121ef-8833-411e-9ab4-3de1db83ef61\") " pod="openshift-monitoring/node-exporter-ndc6q"
Apr 17 17:25:30.249999 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:30.249917 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/bd7121ef-8833-411e-9ab4-3de1db83ef61-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-ndc6q\" (UID: \"bd7121ef-8833-411e-9ab4-3de1db83ef61\") " pod="openshift-monitoring/node-exporter-ndc6q"
Apr 17 17:25:30.249999 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:30.249975 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/5d29d095-2263-4fd2-98c2-ee342220c960-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-lklgp\" (UID: \"5d29d095-2263-4fd2-98c2-ee342220c960\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-lklgp"
Apr 17 17:25:30.250465 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:30.250038 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/bd7121ef-8833-411e-9ab4-3de1db83ef61-node-exporter-accelerators-collector-config\") pod \"node-exporter-ndc6q\" (UID: \"bd7121ef-8833-411e-9ab4-3de1db83ef61\") " pod="openshift-monitoring/node-exporter-ndc6q"
Apr 17 17:25:30.250465 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:30.250080 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e9b9ea6f-26a0-4cca-b8be-5bedbf607826-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-24sl4\" (UID: \"e9b9ea6f-26a0-4cca-b8be-5bedbf607826\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-24sl4"
Apr 17 17:25:30.250465 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:30.250108 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nnwdh\" (UniqueName: \"kubernetes.io/projected/e9b9ea6f-26a0-4cca-b8be-5bedbf607826-kube-api-access-nnwdh\") pod \"openshift-state-metrics-9d44df66c-24sl4\" (UID: \"e9b9ea6f-26a0-4cca-b8be-5bedbf607826\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-24sl4"
Apr 17 17:25:30.250465 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:30.250143 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/bd7121ef-8833-411e-9ab4-3de1db83ef61-root\") pod \"node-exporter-ndc6q\" (UID: \"bd7121ef-8833-411e-9ab4-3de1db83ef61\") " pod="openshift-monitoring/node-exporter-ndc6q"
Apr 17 17:25:30.250465 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:30.250172 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tgpd2\" (UniqueName: \"kubernetes.io/projected/5d29d095-2263-4fd2-98c2-ee342220c960-kube-api-access-tgpd2\") pod \"kube-state-metrics-69db897b98-lklgp\" (UID: \"5d29d095-2263-4fd2-98c2-ee342220c960\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-lklgp"
Apr 17 17:25:30.250465 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:30.250251 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cnrj5\" (UniqueName: \"kubernetes.io/projected/bd7121ef-8833-411e-9ab4-3de1db83ef61-kube-api-access-cnrj5\") pod \"node-exporter-ndc6q\" (UID: \"bd7121ef-8833-411e-9ab4-3de1db83ef61\") " pod="openshift-monitoring/node-exporter-ndc6q"
Apr 17 17:25:30.250465 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:30.250263 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/bd7121ef-8833-411e-9ab4-3de1db83ef61-node-exporter-textfile\") pod \"node-exporter-ndc6q\" (UID: \"bd7121ef-8833-411e-9ab4-3de1db83ef61\") " pod="openshift-monitoring/node-exporter-ndc6q"
Apr 17 17:25:30.250465 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:30.250299 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bd7121ef-8833-411e-9ab4-3de1db83ef61-metrics-client-ca\") pod \"node-exporter-ndc6q\" (UID: \"bd7121ef-8833-411e-9ab4-3de1db83ef61\") " pod="openshift-monitoring/node-exporter-ndc6q"
Apr 17 17:25:30.250465 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:30.250324 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bd7121ef-8833-411e-9ab4-3de1db83ef61-sys\") pod \"node-exporter-ndc6q\" (UID: \"bd7121ef-8833-411e-9ab4-3de1db83ef61\") " pod="openshift-monitoring/node-exporter-ndc6q"
Apr 17 17:25:30.250465 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:30.250329 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/bd7121ef-8833-411e-9ab4-3de1db83ef61-node-exporter-tls\") pod \"node-exporter-ndc6q\" (UID: \"bd7121ef-8833-411e-9ab4-3de1db83ef61\") " pod="openshift-monitoring/node-exporter-ndc6q"
Apr 17 17:25:30.250465 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:25:30.250412 2573 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 17 17:25:30.250465 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:25:30.250470 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd7121ef-8833-411e-9ab4-3de1db83ef61-node-exporter-tls podName:bd7121ef-8833-411e-9ab4-3de1db83ef61 nodeName:}" failed. No retries permitted until 2026-04-17 17:25:30.750452431 +0000 UTC m=+76.066534253 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/bd7121ef-8833-411e-9ab4-3de1db83ef61-node-exporter-tls") pod "node-exporter-ndc6q" (UID: "bd7121ef-8833-411e-9ab4-3de1db83ef61") : secret "node-exporter-tls" not found
Apr 17 17:25:30.251092 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:30.250568 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e9b9ea6f-26a0-4cca-b8be-5bedbf607826-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-24sl4\" (UID: \"e9b9ea6f-26a0-4cca-b8be-5bedbf607826\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-24sl4"
Apr 17 17:25:30.251092 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:30.250585 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/5d29d095-2263-4fd2-98c2-ee342220c960-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-lklgp\" (UID: \"5d29d095-2263-4fd2-98c2-ee342220c960\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-lklgp"
Apr 17 17:25:30.251092 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:30.250870 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5d29d095-2263-4fd2-98c2-ee342220c960-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-lklgp\" (UID: \"5d29d095-2263-4fd2-98c2-ee342220c960\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-lklgp"
Apr 17 17:25:30.251092 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:30.250414 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/bd7121ef-8833-411e-9ab4-3de1db83ef61-root\") pod \"node-exporter-ndc6q\" (UID: \"bd7121ef-8833-411e-9ab4-3de1db83ef61\") " pod="openshift-monitoring/node-exporter-ndc6q"
Apr 17 17:25:30.251407 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:30.251384 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bd7121ef-8833-411e-9ab4-3de1db83ef61-metrics-client-ca\") pod \"node-exporter-ndc6q\" (UID: \"bd7121ef-8833-411e-9ab4-3de1db83ef61\") " pod="openshift-monitoring/node-exporter-ndc6q"
Apr 17 17:25:30.251625 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:30.251602 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/bd7121ef-8833-411e-9ab4-3de1db83ef61-node-exporter-accelerators-collector-config\") pod \"node-exporter-ndc6q\" (UID: \"bd7121ef-8833-411e-9ab4-3de1db83ef61\") " pod="openshift-monitoring/node-exporter-ndc6q"
Apr 17 17:25:30.251934 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:30.251907 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/5d29d095-2263-4fd2-98c2-ee342220c960-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-lklgp\" (UID: \"5d29d095-2263-4fd2-98c2-ee342220c960\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-lklgp"
Apr 17 17:25:30.252653 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:30.252628 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/e9b9ea6f-26a0-4cca-b8be-5bedbf607826-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-24sl4\" (UID: \"e9b9ea6f-26a0-4cca-b8be-5bedbf607826\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-24sl4"
Apr 17 17:25:30.252653 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:30.252650 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/bd7121ef-8833-411e-9ab4-3de1db83ef61-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-ndc6q\" (UID: \"bd7121ef-8833-411e-9ab4-3de1db83ef61\") " pod="openshift-monitoring/node-exporter-ndc6q"
Apr 17 17:25:30.252788 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:30.252747 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/5d29d095-2263-4fd2-98c2-ee342220c960-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-lklgp\" (UID: \"5d29d095-2263-4fd2-98c2-ee342220c960\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-lklgp"
Apr 17 17:25:30.252966 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:30.252950 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e9b9ea6f-26a0-4cca-b8be-5bedbf607826-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-24sl4\" (UID: \"e9b9ea6f-26a0-4cca-b8be-5bedbf607826\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-24sl4"
Apr 17 17:25:30.253176 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:30.253155 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5d29d095-2263-4fd2-98c2-ee342220c960-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-lklgp\" (UID: \"5d29d095-2263-4fd2-98c2-ee342220c960\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-lklgp"
Apr 17 17:25:30.260585 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:30.260550 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnwdh\" (UniqueName: \"kubernetes.io/projected/e9b9ea6f-26a0-4cca-b8be-5bedbf607826-kube-api-access-nnwdh\") pod \"openshift-state-metrics-9d44df66c-24sl4\" (UID: \"e9b9ea6f-26a0-4cca-b8be-5bedbf607826\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-24sl4"
Apr 17 17:25:30.260667 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:30.260606 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnrj5\" (UniqueName: \"kubernetes.io/projected/bd7121ef-8833-411e-9ab4-3de1db83ef61-kube-api-access-cnrj5\") pod \"node-exporter-ndc6q\" (UID: \"bd7121ef-8833-411e-9ab4-3de1db83ef61\") " pod="openshift-monitoring/node-exporter-ndc6q"
Apr 17 17:25:30.261194 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:30.261172 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgpd2\" (UniqueName: \"kubernetes.io/projected/5d29d095-2263-4fd2-98c2-ee342220c960-kube-api-access-tgpd2\") pod \"kube-state-metrics-69db897b98-lklgp\" (UID: \"5d29d095-2263-4fd2-98c2-ee342220c960\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-lklgp"
Apr 17 17:25:30.330947 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:30.330919 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-24sl4"
Apr 17 17:25:30.348905 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:30.348881 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-lklgp" Apr 17 17:25:30.488195 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:30.488167 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-24sl4"] Apr 17 17:25:30.509241 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:25:30.509210 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9b9ea6f_26a0_4cca_b8be_5bedbf607826.slice/crio-e27fe81a9e5e06338f02f6edfd10b1c6c0390b9c9fef000c007d51b2925f981a WatchSource:0}: Error finding container e27fe81a9e5e06338f02f6edfd10b1c6c0390b9c9fef000c007d51b2925f981a: Status 404 returned error can't find the container with id e27fe81a9e5e06338f02f6edfd10b1c6c0390b9c9fef000c007d51b2925f981a Apr 17 17:25:30.510123 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:30.510104 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-lklgp"] Apr 17 17:25:30.511886 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:25:30.511862 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d29d095_2263_4fd2_98c2_ee342220c960.slice/crio-d61c19dce8739889e9299205bb4039cfee29f395ec9fb9453167010835a24d1a WatchSource:0}: Error finding container d61c19dce8739889e9299205bb4039cfee29f395ec9fb9453167010835a24d1a: Status 404 returned error can't find the container with id d61c19dce8739889e9299205bb4039cfee29f395ec9fb9453167010835a24d1a Apr 17 17:25:30.755250 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:30.755214 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/bd7121ef-8833-411e-9ab4-3de1db83ef61-node-exporter-tls\") pod \"node-exporter-ndc6q\" (UID: \"bd7121ef-8833-411e-9ab4-3de1db83ef61\") " 
pod="openshift-monitoring/node-exporter-ndc6q" Apr 17 17:25:30.757420 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:30.757377 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/bd7121ef-8833-411e-9ab4-3de1db83ef61-node-exporter-tls\") pod \"node-exporter-ndc6q\" (UID: \"bd7121ef-8833-411e-9ab4-3de1db83ef61\") " pod="openshift-monitoring/node-exporter-ndc6q" Apr 17 17:25:30.971032 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:30.970996 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-ndc6q" Apr 17 17:25:30.978529 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:25:30.978504 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd7121ef_8833_411e_9ab4_3de1db83ef61.slice/crio-4ed736554c496c33ffe649c3a7faa182d318257e1771df69e69d811d8f5a4ddf WatchSource:0}: Error finding container 4ed736554c496c33ffe649c3a7faa182d318257e1771df69e69d811d8f5a4ddf: Status 404 returned error can't find the container with id 4ed736554c496c33ffe649c3a7faa182d318257e1771df69e69d811d8f5a4ddf Apr 17 17:25:31.520140 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:31.520066 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-ndc6q" event={"ID":"bd7121ef-8833-411e-9ab4-3de1db83ef61","Type":"ContainerStarted","Data":"4ed736554c496c33ffe649c3a7faa182d318257e1771df69e69d811d8f5a4ddf"} Apr 17 17:25:31.522296 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:31.522265 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-24sl4" event={"ID":"e9b9ea6f-26a0-4cca-b8be-5bedbf607826","Type":"ContainerStarted","Data":"acaf0de2c15d7fefbe46584bdd586f269a265c59a824bb878d0d4064264a4218"} Apr 17 17:25:31.522432 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:31.522301 2573 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-24sl4" event={"ID":"e9b9ea6f-26a0-4cca-b8be-5bedbf607826","Type":"ContainerStarted","Data":"b4e768167e0ebc83e99b18181097d464af91cf8077a33f3c70c4e84d8cffb9e4"} Apr 17 17:25:31.522432 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:31.522315 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-24sl4" event={"ID":"e9b9ea6f-26a0-4cca-b8be-5bedbf607826","Type":"ContainerStarted","Data":"e27fe81a9e5e06338f02f6edfd10b1c6c0390b9c9fef000c007d51b2925f981a"} Apr 17 17:25:31.523405 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:31.523370 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-lklgp" event={"ID":"5d29d095-2263-4fd2-98c2-ee342220c960","Type":"ContainerStarted","Data":"d61c19dce8739889e9299205bb4039cfee29f395ec9fb9453167010835a24d1a"} Apr 17 17:25:32.528169 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:32.528134 2573 generic.go:358] "Generic (PLEG): container finished" podID="bd7121ef-8833-411e-9ab4-3de1db83ef61" containerID="fda00bb991b936be32e08990e5b0f07b4f55efc19ec89242fe238cb686ac68ba" exitCode=0 Apr 17 17:25:32.528623 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:32.528223 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-ndc6q" event={"ID":"bd7121ef-8833-411e-9ab4-3de1db83ef61","Type":"ContainerDied","Data":"fda00bb991b936be32e08990e5b0f07b4f55efc19ec89242fe238cb686ac68ba"} Apr 17 17:25:32.530179 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:32.530148 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-24sl4" event={"ID":"e9b9ea6f-26a0-4cca-b8be-5bedbf607826","Type":"ContainerStarted","Data":"2a3d1e5f87199cedeb877dde432b911d4d666810c948c352d4621b6d49dffac0"} Apr 17 17:25:32.532047 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:32.532005 
2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-lklgp" event={"ID":"5d29d095-2263-4fd2-98c2-ee342220c960","Type":"ContainerStarted","Data":"63e0955d5088e838ed6ec7c28ce3d8b6e871616c2420889606162d9281d43fbe"} Apr 17 17:25:32.532130 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:32.532052 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-lklgp" event={"ID":"5d29d095-2263-4fd2-98c2-ee342220c960","Type":"ContainerStarted","Data":"a4d11179e70d05d205c5140f3b8931bf496377c72bf4284b9d4710614317738f"} Apr 17 17:25:32.532130 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:32.532062 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-lklgp" event={"ID":"5d29d095-2263-4fd2-98c2-ee342220c960","Type":"ContainerStarted","Data":"c100e3af98a79eeb78df4a1e0ddf547231cdeb100cac26ee53df4d3af8e113d2"} Apr 17 17:25:32.576537 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:32.576490 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-lklgp" podStartSLOduration=1.044439632 podStartE2EDuration="2.576477126s" podCreationTimestamp="2026-04-17 17:25:30 +0000 UTC" firstStartedPulling="2026-04-17 17:25:30.513649652 +0000 UTC m=+75.829731473" lastFinishedPulling="2026-04-17 17:25:32.045687144 +0000 UTC m=+77.361768967" observedRunningTime="2026-04-17 17:25:32.575282971 +0000 UTC m=+77.891364813" watchObservedRunningTime="2026-04-17 17:25:32.576477126 +0000 UTC m=+77.892558968" Apr 17 17:25:32.603660 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:32.603622 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-24sl4" podStartSLOduration=2.191776206 podStartE2EDuration="3.603610003s" podCreationTimestamp="2026-04-17 17:25:29 +0000 UTC" firstStartedPulling="2026-04-17 
17:25:30.631705918 +0000 UTC m=+75.947787739" lastFinishedPulling="2026-04-17 17:25:32.043539716 +0000 UTC m=+77.359621536" observedRunningTime="2026-04-17 17:25:32.60179039 +0000 UTC m=+77.917872231" watchObservedRunningTime="2026-04-17 17:25:32.603610003 +0000 UTC m=+77.919691864" Apr 17 17:25:33.487634 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:33.487604 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-77nfq" Apr 17 17:25:33.537195 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:33.537157 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-ndc6q" event={"ID":"bd7121ef-8833-411e-9ab4-3de1db83ef61","Type":"ContainerStarted","Data":"a97e166bbfafc49f39562eb61f0349f82b7d5c420d649c5800897bbbc7a8f8df"} Apr 17 17:25:33.537586 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:33.537205 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-ndc6q" event={"ID":"bd7121ef-8833-411e-9ab4-3de1db83ef61","Type":"ContainerStarted","Data":"a14af55f3a71e9b41582b42b9463a5c6f7b4759e38c05d35513333515975647c"} Apr 17 17:25:33.554965 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:33.554861 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-ndc6q" podStartSLOduration=2.4912657879999998 podStartE2EDuration="3.55484315s" podCreationTimestamp="2026-04-17 17:25:30 +0000 UTC" firstStartedPulling="2026-04-17 17:25:30.980062521 +0000 UTC m=+76.296144344" lastFinishedPulling="2026-04-17 17:25:32.043639883 +0000 UTC m=+77.359721706" observedRunningTime="2026-04-17 17:25:33.553983282 +0000 UTC m=+78.870065125" watchObservedRunningTime="2026-04-17 17:25:33.55484315 +0000 UTC m=+78.870924993" Apr 17 17:25:36.211667 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:36.211560 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 17:25:36.216380 
ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:36.216363 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:25:36.220822 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:36.220797 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 17 17:25:36.220909 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:36.220851 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 17 17:25:36.220958 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:36.220944 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 17 17:25:36.221459 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:36.221435 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 17 17:25:36.221567 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:36.221526 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 17 17:25:36.221652 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:36.221598 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 17 17:25:36.221772 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:36.221755 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 17 17:25:36.221941 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:36.221909 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 17 17:25:36.222167 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:36.222149 2573 reflector.go:430] 
"Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 17 17:25:36.222167 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:36.222160 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-dsmsg\"" Apr 17 17:25:36.222309 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:36.222157 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 17 17:25:36.222309 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:36.222208 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-cf8lnue8b2bhd\"" Apr 17 17:25:36.223540 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:36.223525 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 17 17:25:36.224425 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:36.224409 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 17 17:25:36.224530 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:36.224506 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 17 17:25:36.233769 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:36.233747 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 17:25:36.295803 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:36.295774 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/cdabd6a2-fd3b-4b45-9884-50f4cedac9eb-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: 
\"cdabd6a2-fd3b-4b45-9884-50f4cedac9eb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:25:36.295905 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:36.295819 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/cdabd6a2-fd3b-4b45-9884-50f4cedac9eb-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"cdabd6a2-fd3b-4b45-9884-50f4cedac9eb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:25:36.295905 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:36.295850 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/cdabd6a2-fd3b-4b45-9884-50f4cedac9eb-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"cdabd6a2-fd3b-4b45-9884-50f4cedac9eb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:25:36.295905 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:36.295887 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cdabd6a2-fd3b-4b45-9884-50f4cedac9eb-config-out\") pod \"prometheus-k8s-0\" (UID: \"cdabd6a2-fd3b-4b45-9884-50f4cedac9eb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:25:36.296107 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:36.295916 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/cdabd6a2-fd3b-4b45-9884-50f4cedac9eb-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"cdabd6a2-fd3b-4b45-9884-50f4cedac9eb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:25:36.296107 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:36.295938 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cdabd6a2-fd3b-4b45-9884-50f4cedac9eb-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"cdabd6a2-fd3b-4b45-9884-50f4cedac9eb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:25:36.296107 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:36.295963 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/cdabd6a2-fd3b-4b45-9884-50f4cedac9eb-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"cdabd6a2-fd3b-4b45-9884-50f4cedac9eb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:25:36.296107 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:36.295986 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cdabd6a2-fd3b-4b45-9884-50f4cedac9eb-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"cdabd6a2-fd3b-4b45-9884-50f4cedac9eb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:25:36.296107 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:36.296051 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cdabd6a2-fd3b-4b45-9884-50f4cedac9eb-config\") pod \"prometheus-k8s-0\" (UID: \"cdabd6a2-fd3b-4b45-9884-50f4cedac9eb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:25:36.296107 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:36.296078 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/cdabd6a2-fd3b-4b45-9884-50f4cedac9eb-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"cdabd6a2-fd3b-4b45-9884-50f4cedac9eb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:25:36.296357 ip-10-0-139-84 
kubenswrapper[2573]: I0417 17:25:36.296119 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/cdabd6a2-fd3b-4b45-9884-50f4cedac9eb-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"cdabd6a2-fd3b-4b45-9884-50f4cedac9eb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:25:36.296357 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:36.296148 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/cdabd6a2-fd3b-4b45-9884-50f4cedac9eb-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"cdabd6a2-fd3b-4b45-9884-50f4cedac9eb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:25:36.296357 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:36.296170 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/cdabd6a2-fd3b-4b45-9884-50f4cedac9eb-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"cdabd6a2-fd3b-4b45-9884-50f4cedac9eb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:25:36.296357 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:36.296197 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/cdabd6a2-fd3b-4b45-9884-50f4cedac9eb-web-config\") pod \"prometheus-k8s-0\" (UID: \"cdabd6a2-fd3b-4b45-9884-50f4cedac9eb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:25:36.296357 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:36.296224 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/cdabd6a2-fd3b-4b45-9884-50f4cedac9eb-prometheus-k8s-db\") pod 
\"prometheus-k8s-0\" (UID: \"cdabd6a2-fd3b-4b45-9884-50f4cedac9eb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:25:36.296357 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:36.296256 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cdabd6a2-fd3b-4b45-9884-50f4cedac9eb-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"cdabd6a2-fd3b-4b45-9884-50f4cedac9eb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:25:36.296357 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:36.296280 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gmh5\" (UniqueName: \"kubernetes.io/projected/cdabd6a2-fd3b-4b45-9884-50f4cedac9eb-kube-api-access-6gmh5\") pod \"prometheus-k8s-0\" (UID: \"cdabd6a2-fd3b-4b45-9884-50f4cedac9eb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:25:36.296357 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:36.296330 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cdabd6a2-fd3b-4b45-9884-50f4cedac9eb-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"cdabd6a2-fd3b-4b45-9884-50f4cedac9eb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:25:36.396590 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:36.396563 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cdabd6a2-fd3b-4b45-9884-50f4cedac9eb-config-out\") pod \"prometheus-k8s-0\" (UID: \"cdabd6a2-fd3b-4b45-9884-50f4cedac9eb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:25:36.396670 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:36.396592 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/cdabd6a2-fd3b-4b45-9884-50f4cedac9eb-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"cdabd6a2-fd3b-4b45-9884-50f4cedac9eb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:25:36.396670 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:36.396610 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cdabd6a2-fd3b-4b45-9884-50f4cedac9eb-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"cdabd6a2-fd3b-4b45-9884-50f4cedac9eb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:25:36.396670 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:36.396636 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/cdabd6a2-fd3b-4b45-9884-50f4cedac9eb-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"cdabd6a2-fd3b-4b45-9884-50f4cedac9eb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:25:36.396670 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:36.396659 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cdabd6a2-fd3b-4b45-9884-50f4cedac9eb-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"cdabd6a2-fd3b-4b45-9884-50f4cedac9eb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:25:36.396861 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:36.396685 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cdabd6a2-fd3b-4b45-9884-50f4cedac9eb-config\") pod \"prometheus-k8s-0\" (UID: \"cdabd6a2-fd3b-4b45-9884-50f4cedac9eb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:25:36.397225 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:36.397163 2573 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/cdabd6a2-fd3b-4b45-9884-50f4cedac9eb-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"cdabd6a2-fd3b-4b45-9884-50f4cedac9eb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:25:36.397225 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:36.397220 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/cdabd6a2-fd3b-4b45-9884-50f4cedac9eb-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"cdabd6a2-fd3b-4b45-9884-50f4cedac9eb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:25:36.397381 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:36.397247 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/cdabd6a2-fd3b-4b45-9884-50f4cedac9eb-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"cdabd6a2-fd3b-4b45-9884-50f4cedac9eb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:25:36.397381 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:36.397271 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/cdabd6a2-fd3b-4b45-9884-50f4cedac9eb-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"cdabd6a2-fd3b-4b45-9884-50f4cedac9eb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:25:36.397381 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:36.397299 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/cdabd6a2-fd3b-4b45-9884-50f4cedac9eb-web-config\") pod \"prometheus-k8s-0\" (UID: \"cdabd6a2-fd3b-4b45-9884-50f4cedac9eb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:25:36.397381 ip-10-0-139-84 kubenswrapper[2573]: I0417 
17:25:36.397327 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/cdabd6a2-fd3b-4b45-9884-50f4cedac9eb-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"cdabd6a2-fd3b-4b45-9884-50f4cedac9eb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:25:36.397381 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:36.397364 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cdabd6a2-fd3b-4b45-9884-50f4cedac9eb-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"cdabd6a2-fd3b-4b45-9884-50f4cedac9eb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:25:36.397619 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:36.397393 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6gmh5\" (UniqueName: \"kubernetes.io/projected/cdabd6a2-fd3b-4b45-9884-50f4cedac9eb-kube-api-access-6gmh5\") pod \"prometheus-k8s-0\" (UID: \"cdabd6a2-fd3b-4b45-9884-50f4cedac9eb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:25:36.397619 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:36.397429 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cdabd6a2-fd3b-4b45-9884-50f4cedac9eb-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"cdabd6a2-fd3b-4b45-9884-50f4cedac9eb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:25:36.397619 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:36.397457 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/cdabd6a2-fd3b-4b45-9884-50f4cedac9eb-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"cdabd6a2-fd3b-4b45-9884-50f4cedac9eb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:25:36.397619 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:36.397506 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/cdabd6a2-fd3b-4b45-9884-50f4cedac9eb-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"cdabd6a2-fd3b-4b45-9884-50f4cedac9eb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:25:36.397619 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:36.397540 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/cdabd6a2-fd3b-4b45-9884-50f4cedac9eb-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"cdabd6a2-fd3b-4b45-9884-50f4cedac9eb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:25:36.397619 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:36.397594 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cdabd6a2-fd3b-4b45-9884-50f4cedac9eb-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"cdabd6a2-fd3b-4b45-9884-50f4cedac9eb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:25:36.397901 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:36.397856 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cdabd6a2-fd3b-4b45-9884-50f4cedac9eb-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"cdabd6a2-fd3b-4b45-9884-50f4cedac9eb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:25:36.399636 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:36.399608 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cdabd6a2-fd3b-4b45-9884-50f4cedac9eb-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"cdabd6a2-fd3b-4b45-9884-50f4cedac9eb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:25:36.399797 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:36.399776 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/cdabd6a2-fd3b-4b45-9884-50f4cedac9eb-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"cdabd6a2-fd3b-4b45-9884-50f4cedac9eb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:25:36.400227 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:36.400200 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/cdabd6a2-fd3b-4b45-9884-50f4cedac9eb-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"cdabd6a2-fd3b-4b45-9884-50f4cedac9eb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:25:36.400333 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:36.400308 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/cdabd6a2-fd3b-4b45-9884-50f4cedac9eb-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"cdabd6a2-fd3b-4b45-9884-50f4cedac9eb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:25:36.400506 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:36.400476 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cdabd6a2-fd3b-4b45-9884-50f4cedac9eb-config-out\") pod \"prometheus-k8s-0\" (UID: \"cdabd6a2-fd3b-4b45-9884-50f4cedac9eb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:25:36.400605 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:36.400579 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cdabd6a2-fd3b-4b45-9884-50f4cedac9eb-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"cdabd6a2-fd3b-4b45-9884-50f4cedac9eb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:25:36.400662 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:36.397858 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/cdabd6a2-fd3b-4b45-9884-50f4cedac9eb-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"cdabd6a2-fd3b-4b45-9884-50f4cedac9eb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:25:36.402425 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:36.401305 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/cdabd6a2-fd3b-4b45-9884-50f4cedac9eb-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"cdabd6a2-fd3b-4b45-9884-50f4cedac9eb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:25:36.402425 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:36.401479 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/cdabd6a2-fd3b-4b45-9884-50f4cedac9eb-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"cdabd6a2-fd3b-4b45-9884-50f4cedac9eb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:25:36.402425 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:36.401606 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/cdabd6a2-fd3b-4b45-9884-50f4cedac9eb-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"cdabd6a2-fd3b-4b45-9884-50f4cedac9eb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:25:36.402425 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:36.402123 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/cdabd6a2-fd3b-4b45-9884-50f4cedac9eb-config\") pod \"prometheus-k8s-0\" (UID: \"cdabd6a2-fd3b-4b45-9884-50f4cedac9eb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:25:36.402864 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:36.402839 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/cdabd6a2-fd3b-4b45-9884-50f4cedac9eb-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"cdabd6a2-fd3b-4b45-9884-50f4cedac9eb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:25:36.403562 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:36.403540 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/cdabd6a2-fd3b-4b45-9884-50f4cedac9eb-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"cdabd6a2-fd3b-4b45-9884-50f4cedac9eb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:25:36.404323 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:36.404303 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/cdabd6a2-fd3b-4b45-9884-50f4cedac9eb-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"cdabd6a2-fd3b-4b45-9884-50f4cedac9eb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:25:36.407585 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:36.407561 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/cdabd6a2-fd3b-4b45-9884-50f4cedac9eb-web-config\") pod \"prometheus-k8s-0\" (UID: \"cdabd6a2-fd3b-4b45-9884-50f4cedac9eb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:25:36.408359 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:36.408342 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gmh5\" (UniqueName: \"kubernetes.io/projected/cdabd6a2-fd3b-4b45-9884-50f4cedac9eb-kube-api-access-6gmh5\") pod \"prometheus-k8s-0\" (UID: \"cdabd6a2-fd3b-4b45-9884-50f4cedac9eb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:25:36.526905 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:36.526831 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:25:36.648805 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:36.648780 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 17 17:25:36.651307 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:25:36.651280 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcdabd6a2_fd3b_4b45_9884_50f4cedac9eb.slice/crio-121fade55067406ef398d267181eb7d74d3096eb5a04f9f5c2e1b761a4f116ae WatchSource:0}: Error finding container 121fade55067406ef398d267181eb7d74d3096eb5a04f9f5c2e1b761a4f116ae: Status 404 returned error can't find the container with id 121fade55067406ef398d267181eb7d74d3096eb5a04f9f5c2e1b761a4f116ae
Apr 17 17:25:37.548638 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:37.548612 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"cdabd6a2-fd3b-4b45-9884-50f4cedac9eb","Type":"ContainerStarted","Data":"05a8bfdd9b407639b64471d7c6eb4afd6b77ee3100d9ddec7759a1b9b62db06c"}
Apr 17 17:25:37.548933 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:37.548649 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"cdabd6a2-fd3b-4b45-9884-50f4cedac9eb","Type":"ContainerStarted","Data":"121fade55067406ef398d267181eb7d74d3096eb5a04f9f5c2e1b761a4f116ae"}
Apr 17 17:25:38.552240 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:38.552205 2573 generic.go:358] "Generic (PLEG): container finished" podID="cdabd6a2-fd3b-4b45-9884-50f4cedac9eb" containerID="05a8bfdd9b407639b64471d7c6eb4afd6b77ee3100d9ddec7759a1b9b62db06c" exitCode=0
Apr 17 17:25:38.552677 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:38.552281 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"cdabd6a2-fd3b-4b45-9884-50f4cedac9eb","Type":"ContainerDied","Data":"05a8bfdd9b407639b64471d7c6eb4afd6b77ee3100d9ddec7759a1b9b62db06c"}
Apr 17 17:25:41.474477 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:41.474445 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-55c4b85f64-487p4"
Apr 17 17:25:42.566271 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:42.566225 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"cdabd6a2-fd3b-4b45-9884-50f4cedac9eb","Type":"ContainerStarted","Data":"293f2e0c2e09fbd47008513d8289d687fdfbeb6b9894efb0e4b35caf28e8dcc5"}
Apr 17 17:25:42.566271 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:42.566266 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"cdabd6a2-fd3b-4b45-9884-50f4cedac9eb","Type":"ContainerStarted","Data":"598024d0299ebdec827a4f51e15a3b321411dfe99f79c101e8c8fa7f93a49984"}
Apr 17 17:25:43.572136 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:43.572062 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"cdabd6a2-fd3b-4b45-9884-50f4cedac9eb","Type":"ContainerStarted","Data":"89727b57dc17340c4c2a362d59a4b875bc72746434717eafc23b229468adb487"}
Apr 17 17:25:43.572136 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:43.572101 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"cdabd6a2-fd3b-4b45-9884-50f4cedac9eb","Type":"ContainerStarted","Data":"9fc62a2c246bebac2935fb24bd863cc36df724d93b7e6d30a27c3cc312bdf4d7"}
Apr 17 17:25:43.572136 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:43.572113 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"cdabd6a2-fd3b-4b45-9884-50f4cedac9eb","Type":"ContainerStarted","Data":"b62b309f25b53cf9c6149751aad6925701fe4fc236e0259a0cd515bfbf477fc9"}
Apr 17 17:25:43.572136 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:43.572125 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"cdabd6a2-fd3b-4b45-9884-50f4cedac9eb","Type":"ContainerStarted","Data":"660cf3ca5b750381abb2fcd7f7ff928d4913172d054968ca2f499fd3da7702be"}
Apr 17 17:25:43.612843 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:43.612789 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=0.960930583 podStartE2EDuration="7.612771044s" podCreationTimestamp="2026-04-17 17:25:36 +0000 UTC" firstStartedPulling="2026-04-17 17:25:36.653642405 +0000 UTC m=+81.969724225" lastFinishedPulling="2026-04-17 17:25:43.305482851 +0000 UTC m=+88.621564686" observedRunningTime="2026-04-17 17:25:43.610543017 +0000 UTC m=+88.926624858" watchObservedRunningTime="2026-04-17 17:25:43.612771044 +0000 UTC m=+88.928852890"
Apr 17 17:25:46.491303 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:46.491261 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-55c4b85f64-487p4" podUID="4b2224cd-10b0-4bf7-bb36-af52cc8d4236" containerName="registry" containerID="cri-o://884be2c45927f509a95d4424be9f5442604bda194b3da268d081c9155832728e" gracePeriod=30
Apr 17 17:25:46.526987 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:46.526960 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:25:46.726754 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:46.726728 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-55c4b85f64-487p4"
Apr 17 17:25:46.883489 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:46.883467 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4b2224cd-10b0-4bf7-bb36-af52cc8d4236-ca-trust-extracted\") pod \"4b2224cd-10b0-4bf7-bb36-af52cc8d4236\" (UID: \"4b2224cd-10b0-4bf7-bb36-af52cc8d4236\") "
Apr 17 17:25:46.883603 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:46.883501 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4b2224cd-10b0-4bf7-bb36-af52cc8d4236-registry-tls\") pod \"4b2224cd-10b0-4bf7-bb36-af52cc8d4236\" (UID: \"4b2224cd-10b0-4bf7-bb36-af52cc8d4236\") "
Apr 17 17:25:46.883603 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:46.883552 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4b2224cd-10b0-4bf7-bb36-af52cc8d4236-bound-sa-token\") pod \"4b2224cd-10b0-4bf7-bb36-af52cc8d4236\" (UID: \"4b2224cd-10b0-4bf7-bb36-af52cc8d4236\") "
Apr 17 17:25:46.883603 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:46.883569 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/4b2224cd-10b0-4bf7-bb36-af52cc8d4236-image-registry-private-configuration\") pod \"4b2224cd-10b0-4bf7-bb36-af52cc8d4236\" (UID: \"4b2224cd-10b0-4bf7-bb36-af52cc8d4236\") "
Apr 17 17:25:46.883603 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:46.883593 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4b2224cd-10b0-4bf7-bb36-af52cc8d4236-trusted-ca\") pod \"4b2224cd-10b0-4bf7-bb36-af52cc8d4236\" (UID: \"4b2224cd-10b0-4bf7-bb36-af52cc8d4236\") "
Apr 17 17:25:46.883808 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:46.883626 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4b2224cd-10b0-4bf7-bb36-af52cc8d4236-installation-pull-secrets\") pod \"4b2224cd-10b0-4bf7-bb36-af52cc8d4236\" (UID: \"4b2224cd-10b0-4bf7-bb36-af52cc8d4236\") "
Apr 17 17:25:46.883808 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:46.883718 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4b2224cd-10b0-4bf7-bb36-af52cc8d4236-registry-certificates\") pod \"4b2224cd-10b0-4bf7-bb36-af52cc8d4236\" (UID: \"4b2224cd-10b0-4bf7-bb36-af52cc8d4236\") "
Apr 17 17:25:46.883808 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:46.883764 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-87nfr\" (UniqueName: \"kubernetes.io/projected/4b2224cd-10b0-4bf7-bb36-af52cc8d4236-kube-api-access-87nfr\") pod \"4b2224cd-10b0-4bf7-bb36-af52cc8d4236\" (UID: \"4b2224cd-10b0-4bf7-bb36-af52cc8d4236\") "
Apr 17 17:25:46.884254 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:46.884224 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b2224cd-10b0-4bf7-bb36-af52cc8d4236-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "4b2224cd-10b0-4bf7-bb36-af52cc8d4236" (UID: "4b2224cd-10b0-4bf7-bb36-af52cc8d4236"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 17:25:46.884392 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:46.884300 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b2224cd-10b0-4bf7-bb36-af52cc8d4236-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "4b2224cd-10b0-4bf7-bb36-af52cc8d4236" (UID: "4b2224cd-10b0-4bf7-bb36-af52cc8d4236"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 17:25:46.886101 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:46.886062 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b2224cd-10b0-4bf7-bb36-af52cc8d4236-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "4b2224cd-10b0-4bf7-bb36-af52cc8d4236" (UID: "4b2224cd-10b0-4bf7-bb36-af52cc8d4236"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 17:25:46.886344 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:46.886312 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b2224cd-10b0-4bf7-bb36-af52cc8d4236-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "4b2224cd-10b0-4bf7-bb36-af52cc8d4236" (UID: "4b2224cd-10b0-4bf7-bb36-af52cc8d4236"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 17:25:46.886439 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:46.886343 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b2224cd-10b0-4bf7-bb36-af52cc8d4236-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "4b2224cd-10b0-4bf7-bb36-af52cc8d4236" (UID: "4b2224cd-10b0-4bf7-bb36-af52cc8d4236"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 17:25:46.886479 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:46.886470 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b2224cd-10b0-4bf7-bb36-af52cc8d4236-kube-api-access-87nfr" (OuterVolumeSpecName: "kube-api-access-87nfr") pod "4b2224cd-10b0-4bf7-bb36-af52cc8d4236" (UID: "4b2224cd-10b0-4bf7-bb36-af52cc8d4236"). InnerVolumeSpecName "kube-api-access-87nfr". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 17:25:46.886521 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:46.886475 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b2224cd-10b0-4bf7-bb36-af52cc8d4236-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "4b2224cd-10b0-4bf7-bb36-af52cc8d4236" (UID: "4b2224cd-10b0-4bf7-bb36-af52cc8d4236"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 17:25:46.891658 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:46.891636 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b2224cd-10b0-4bf7-bb36-af52cc8d4236-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "4b2224cd-10b0-4bf7-bb36-af52cc8d4236" (UID: "4b2224cd-10b0-4bf7-bb36-af52cc8d4236"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 17:25:46.984900 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:46.984872 2573 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4b2224cd-10b0-4bf7-bb36-af52cc8d4236-installation-pull-secrets\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\""
Apr 17 17:25:46.984900 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:46.984896 2573 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4b2224cd-10b0-4bf7-bb36-af52cc8d4236-registry-certificates\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\""
Apr 17 17:25:46.985053 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:46.984909 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-87nfr\" (UniqueName: \"kubernetes.io/projected/4b2224cd-10b0-4bf7-bb36-af52cc8d4236-kube-api-access-87nfr\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\""
Apr 17 17:25:46.985053 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:46.984918 2573 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4b2224cd-10b0-4bf7-bb36-af52cc8d4236-ca-trust-extracted\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\""
Apr 17 17:25:46.985053 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:46.984928 2573 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4b2224cd-10b0-4bf7-bb36-af52cc8d4236-registry-tls\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\""
Apr 17 17:25:46.985053 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:46.984936 2573 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4b2224cd-10b0-4bf7-bb36-af52cc8d4236-bound-sa-token\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\""
Apr 17 17:25:46.985053 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:46.984945 2573 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/4b2224cd-10b0-4bf7-bb36-af52cc8d4236-image-registry-private-configuration\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\""
Apr 17 17:25:46.985053 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:46.984954 2573 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4b2224cd-10b0-4bf7-bb36-af52cc8d4236-trusted-ca\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\""
Apr 17 17:25:47.584691 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:47.584594 2573 generic.go:358] "Generic (PLEG): container finished" podID="4b2224cd-10b0-4bf7-bb36-af52cc8d4236" containerID="884be2c45927f509a95d4424be9f5442604bda194b3da268d081c9155832728e" exitCode=0
Apr 17 17:25:47.584691 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:47.584683 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-55c4b85f64-487p4"
Apr 17 17:25:47.585291 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:47.584679 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-55c4b85f64-487p4" event={"ID":"4b2224cd-10b0-4bf7-bb36-af52cc8d4236","Type":"ContainerDied","Data":"884be2c45927f509a95d4424be9f5442604bda194b3da268d081c9155832728e"}
Apr 17 17:25:47.585291 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:47.584789 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-55c4b85f64-487p4" event={"ID":"4b2224cd-10b0-4bf7-bb36-af52cc8d4236","Type":"ContainerDied","Data":"2482fbb972bc8b886a9c258d4175e5dbd0c87005eaf68aaed0cd72f354fd9585"}
Apr 17 17:25:47.585291 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:47.584815 2573 scope.go:117] "RemoveContainer" containerID="884be2c45927f509a95d4424be9f5442604bda194b3da268d081c9155832728e"
Apr 17 17:25:47.593995 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:47.593981 2573 scope.go:117] "RemoveContainer" containerID="884be2c45927f509a95d4424be9f5442604bda194b3da268d081c9155832728e"
Apr 17 17:25:47.594264 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:25:47.594245 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"884be2c45927f509a95d4424be9f5442604bda194b3da268d081c9155832728e\": container with ID starting with 884be2c45927f509a95d4424be9f5442604bda194b3da268d081c9155832728e not found: ID does not exist" containerID="884be2c45927f509a95d4424be9f5442604bda194b3da268d081c9155832728e"
Apr 17 17:25:47.594331 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:47.594271 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"884be2c45927f509a95d4424be9f5442604bda194b3da268d081c9155832728e"} err="failed to get container status \"884be2c45927f509a95d4424be9f5442604bda194b3da268d081c9155832728e\": rpc error: code = NotFound desc = could not find container \"884be2c45927f509a95d4424be9f5442604bda194b3da268d081c9155832728e\": container with ID starting with 884be2c45927f509a95d4424be9f5442604bda194b3da268d081c9155832728e not found: ID does not exist"
Apr 17 17:25:47.600369 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:47.600344 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-55c4b85f64-487p4"]
Apr 17 17:25:47.604575 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:47.604554 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-55c4b85f64-487p4"]
Apr 17 17:25:49.195958 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:25:49.195929 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b2224cd-10b0-4bf7-bb36-af52cc8d4236" path="/var/lib/kubelet/pods/4b2224cd-10b0-4bf7-bb36-af52cc8d4236/volumes"
Apr 17 17:26:36.527071 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:36.527038 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:26:36.545060 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:36.545037 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:26:36.726738 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:36.726712 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:26:54.579228 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:54.579189 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 17 17:26:54.579824 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:54.579771 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="cdabd6a2-fd3b-4b45-9884-50f4cedac9eb" containerName="prometheus" containerID="cri-o://598024d0299ebdec827a4f51e15a3b321411dfe99f79c101e8c8fa7f93a49984" gracePeriod=600
Apr 17 17:26:54.580352 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:54.580325 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="cdabd6a2-fd3b-4b45-9884-50f4cedac9eb" containerName="kube-rbac-proxy-thanos" containerID="cri-o://89727b57dc17340c4c2a362d59a4b875bc72746434717eafc23b229468adb487" gracePeriod=600
Apr 17 17:26:54.580551 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:54.580533 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="cdabd6a2-fd3b-4b45-9884-50f4cedac9eb" containerName="kube-rbac-proxy" containerID="cri-o://9fc62a2c246bebac2935fb24bd863cc36df724d93b7e6d30a27c3cc312bdf4d7" gracePeriod=600
Apr 17 17:26:54.580695 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:54.580671 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="cdabd6a2-fd3b-4b45-9884-50f4cedac9eb" containerName="kube-rbac-proxy-web" containerID="cri-o://b62b309f25b53cf9c6149751aad6925701fe4fc236e0259a0cd515bfbf477fc9" gracePeriod=600
Apr 17 17:26:54.580829 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:54.580798 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="cdabd6a2-fd3b-4b45-9884-50f4cedac9eb" containerName="thanos-sidecar" containerID="cri-o://660cf3ca5b750381abb2fcd7f7ff928d4913172d054968ca2f499fd3da7702be" gracePeriod=600
Apr 17 17:26:54.581478 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:54.580974 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="cdabd6a2-fd3b-4b45-9884-50f4cedac9eb" containerName="config-reloader" containerID="cri-o://293f2e0c2e09fbd47008513d8289d687fdfbeb6b9894efb0e4b35caf28e8dcc5" gracePeriod=600
Apr 17 17:26:54.762707 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:54.762672 2573 generic.go:358] "Generic (PLEG): container finished" podID="cdabd6a2-fd3b-4b45-9884-50f4cedac9eb" containerID="89727b57dc17340c4c2a362d59a4b875bc72746434717eafc23b229468adb487" exitCode=0
Apr 17 17:26:54.762707 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:54.762704 2573 generic.go:358] "Generic (PLEG): container finished" podID="cdabd6a2-fd3b-4b45-9884-50f4cedac9eb" containerID="9fc62a2c246bebac2935fb24bd863cc36df724d93b7e6d30a27c3cc312bdf4d7" exitCode=0
Apr 17 17:26:54.762911 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:54.762717 2573 generic.go:358] "Generic (PLEG): container finished" podID="cdabd6a2-fd3b-4b45-9884-50f4cedac9eb" containerID="b62b309f25b53cf9c6149751aad6925701fe4fc236e0259a0cd515bfbf477fc9" exitCode=0
Apr 17 17:26:54.762911 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:54.762720 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"cdabd6a2-fd3b-4b45-9884-50f4cedac9eb","Type":"ContainerDied","Data":"89727b57dc17340c4c2a362d59a4b875bc72746434717eafc23b229468adb487"}
Apr 17 17:26:54.762911 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:54.762759 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"cdabd6a2-fd3b-4b45-9884-50f4cedac9eb","Type":"ContainerDied","Data":"9fc62a2c246bebac2935fb24bd863cc36df724d93b7e6d30a27c3cc312bdf4d7"}
Apr 17 17:26:54.762911 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:54.762772 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"cdabd6a2-fd3b-4b45-9884-50f4cedac9eb","Type":"ContainerDied","Data":"b62b309f25b53cf9c6149751aad6925701fe4fc236e0259a0cd515bfbf477fc9"}
Apr 17 17:26:54.762911 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:54.762783 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"cdabd6a2-fd3b-4b45-9884-50f4cedac9eb","Type":"ContainerDied","Data":"660cf3ca5b750381abb2fcd7f7ff928d4913172d054968ca2f499fd3da7702be"}
Apr 17 17:26:54.762911 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:54.762729 2573 generic.go:358] "Generic (PLEG): container finished" podID="cdabd6a2-fd3b-4b45-9884-50f4cedac9eb" containerID="660cf3ca5b750381abb2fcd7f7ff928d4913172d054968ca2f499fd3da7702be" exitCode=0
Apr 17 17:26:54.762911 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:54.762799 2573 generic.go:358] "Generic (PLEG): container finished" podID="cdabd6a2-fd3b-4b45-9884-50f4cedac9eb" containerID="293f2e0c2e09fbd47008513d8289d687fdfbeb6b9894efb0e4b35caf28e8dcc5" exitCode=0
Apr 17 17:26:54.762911 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:54.762810 2573 generic.go:358] "Generic (PLEG): container finished" podID="cdabd6a2-fd3b-4b45-9884-50f4cedac9eb" containerID="598024d0299ebdec827a4f51e15a3b321411dfe99f79c101e8c8fa7f93a49984" exitCode=0
Apr 17 17:26:54.762911 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:54.762882 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"cdabd6a2-fd3b-4b45-9884-50f4cedac9eb","Type":"ContainerDied","Data":"293f2e0c2e09fbd47008513d8289d687fdfbeb6b9894efb0e4b35caf28e8dcc5"}
Apr 17 17:26:54.762911 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:54.762910 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"cdabd6a2-fd3b-4b45-9884-50f4cedac9eb","Type":"ContainerDied","Data":"598024d0299ebdec827a4f51e15a3b321411dfe99f79c101e8c8fa7f93a49984"}
Apr 17 17:26:54.802811 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:54.802789 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:26:54.870945 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:54.870919 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cdabd6a2-fd3b-4b45-9884-50f4cedac9eb-configmap-kubelet-serving-ca-bundle\") pod \"cdabd6a2-fd3b-4b45-9884-50f4cedac9eb\" (UID: \"cdabd6a2-fd3b-4b45-9884-50f4cedac9eb\") "
Apr 17 17:26:54.871114 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:54.870963 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cdabd6a2-fd3b-4b45-9884-50f4cedac9eb-config\") pod \"cdabd6a2-fd3b-4b45-9884-50f4cedac9eb\" (UID: \"cdabd6a2-fd3b-4b45-9884-50f4cedac9eb\") "
Apr 17 17:26:54.871114 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:54.870981 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/cdabd6a2-fd3b-4b45-9884-50f4cedac9eb-prometheus-k8s-rulefiles-0\") pod \"cdabd6a2-fd3b-4b45-9884-50f4cedac9eb\" (UID: \"cdabd6a2-fd3b-4b45-9884-50f4cedac9eb\") "
Apr 17 17:26:54.871114 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:54.870998 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/cdabd6a2-fd3b-4b45-9884-50f4cedac9eb-secret-prometheus-k8s-tls\") pod \"cdabd6a2-fd3b-4b45-9884-50f4cedac9eb\" (UID: \"cdabd6a2-fd3b-4b45-9884-50f4cedac9eb\") "
Apr 17 17:26:54.871114 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:54.871037 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cdabd6a2-fd3b-4b45-9884-50f4cedac9eb-configmap-serving-certs-ca-bundle\") pod \"cdabd6a2-fd3b-4b45-9884-50f4cedac9eb\" (UID: \"cdabd6a2-fd3b-4b45-9884-50f4cedac9eb\") "
Apr 17 17:26:54.871114 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:54.871068 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/cdabd6a2-fd3b-4b45-9884-50f4cedac9eb-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"cdabd6a2-fd3b-4b45-9884-50f4cedac9eb\" (UID: \"cdabd6a2-fd3b-4b45-9884-50f4cedac9eb\") "
Apr 17 17:26:54.871114 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:54.871084 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cdabd6a2-fd3b-4b45-9884-50f4cedac9eb-config-out\") pod \"cdabd6a2-fd3b-4b45-9884-50f4cedac9eb\" (UID: \"cdabd6a2-fd3b-4b45-9884-50f4cedac9eb\") "
Apr 17 17:26:54.871114 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:54.871105 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/cdabd6a2-fd3b-4b45-9884-50f4cedac9eb-web-config\") pod \"cdabd6a2-fd3b-4b45-9884-50f4cedac9eb\" (UID: \"cdabd6a2-fd3b-4b45-9884-50f4cedac9eb\") "
Apr 17 17:26:54.871522 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:54.871135 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6gmh5\" (UniqueName: \"kubernetes.io/projected/cdabd6a2-fd3b-4b45-9884-50f4cedac9eb-kube-api-access-6gmh5\") pod \"cdabd6a2-fd3b-4b45-9884-50f4cedac9eb\" (UID: \"cdabd6a2-fd3b-4b45-9884-50f4cedac9eb\") "
Apr 17 17:26:54.871522 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:54.871158 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cdabd6a2-fd3b-4b45-9884-50f4cedac9eb-prometheus-trusted-ca-bundle\") pod \"cdabd6a2-fd3b-4b45-9884-50f4cedac9eb\" (UID: \"cdabd6a2-fd3b-4b45-9884-50f4cedac9eb\") "
Apr 17 17:26:54.871522 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:54.871173 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/cdabd6a2-fd3b-4b45-9884-50f4cedac9eb-prometheus-k8s-db\") pod \"cdabd6a2-fd3b-4b45-9884-50f4cedac9eb\" (UID: \"cdabd6a2-fd3b-4b45-9884-50f4cedac9eb\") "
Apr 17 17:26:54.871522 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:54.871198 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/cdabd6a2-fd3b-4b45-9884-50f4cedac9eb-secret-kube-rbac-proxy\") pod \"cdabd6a2-fd3b-4b45-9884-50f4cedac9eb\" (UID: \"cdabd6a2-fd3b-4b45-9884-50f4cedac9eb\") "
Apr 17 17:26:54.871522 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:54.871216 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cdabd6a2-fd3b-4b45-9884-50f4cedac9eb-configmap-metrics-client-ca\") pod \"cdabd6a2-fd3b-4b45-9884-50f4cedac9eb\" (UID: \"cdabd6a2-fd3b-4b45-9884-50f4cedac9eb\") "
Apr 17 17:26:54.871522 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:54.871243 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/cdabd6a2-fd3b-4b45-9884-50f4cedac9eb-secret-metrics-client-certs\") pod \"cdabd6a2-fd3b-4b45-9884-50f4cedac9eb\" (UID: \"cdabd6a2-fd3b-4b45-9884-50f4cedac9eb\") "
Apr 17 17:26:54.871522 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:54.871265 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/cdabd6a2-fd3b-4b45-9884-50f4cedac9eb-secret-grpc-tls\") pod \"cdabd6a2-fd3b-4b45-9884-50f4cedac9eb\" (UID: \"cdabd6a2-fd3b-4b45-9884-50f4cedac9eb\") "
Apr 17 17:26:54.871522 ip-10-0-139-84 kubenswrapper[2573]: I0417
17:26:54.871289 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/cdabd6a2-fd3b-4b45-9884-50f4cedac9eb-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"cdabd6a2-fd3b-4b45-9884-50f4cedac9eb\" (UID: \"cdabd6a2-fd3b-4b45-9884-50f4cedac9eb\") " Apr 17 17:26:54.871522 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:54.871310 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/cdabd6a2-fd3b-4b45-9884-50f4cedac9eb-tls-assets\") pod \"cdabd6a2-fd3b-4b45-9884-50f4cedac9eb\" (UID: \"cdabd6a2-fd3b-4b45-9884-50f4cedac9eb\") " Apr 17 17:26:54.871522 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:54.871332 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/cdabd6a2-fd3b-4b45-9884-50f4cedac9eb-thanos-prometheus-http-client-file\") pod \"cdabd6a2-fd3b-4b45-9884-50f4cedac9eb\" (UID: \"cdabd6a2-fd3b-4b45-9884-50f4cedac9eb\") " Apr 17 17:26:54.871522 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:54.871419 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cdabd6a2-fd3b-4b45-9884-50f4cedac9eb-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "cdabd6a2-fd3b-4b45-9884-50f4cedac9eb" (UID: "cdabd6a2-fd3b-4b45-9884-50f4cedac9eb"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:26:54.872044 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:54.871539 2573 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cdabd6a2-fd3b-4b45-9884-50f4cedac9eb-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\"" Apr 17 17:26:54.872951 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:54.872921 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cdabd6a2-fd3b-4b45-9884-50f4cedac9eb-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "cdabd6a2-fd3b-4b45-9884-50f4cedac9eb" (UID: "cdabd6a2-fd3b-4b45-9884-50f4cedac9eb"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:26:54.873090 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:54.872971 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cdabd6a2-fd3b-4b45-9884-50f4cedac9eb-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "cdabd6a2-fd3b-4b45-9884-50f4cedac9eb" (UID: "cdabd6a2-fd3b-4b45-9884-50f4cedac9eb"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:26:54.873090 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:54.873057 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cdabd6a2-fd3b-4b45-9884-50f4cedac9eb-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "cdabd6a2-fd3b-4b45-9884-50f4cedac9eb" (UID: "cdabd6a2-fd3b-4b45-9884-50f4cedac9eb"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:26:54.873729 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:54.873699 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cdabd6a2-fd3b-4b45-9884-50f4cedac9eb-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "cdabd6a2-fd3b-4b45-9884-50f4cedac9eb" (UID: "cdabd6a2-fd3b-4b45-9884-50f4cedac9eb"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:26:54.873825 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:54.873792 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdabd6a2-fd3b-4b45-9884-50f4cedac9eb-config" (OuterVolumeSpecName: "config") pod "cdabd6a2-fd3b-4b45-9884-50f4cedac9eb" (UID: "cdabd6a2-fd3b-4b45-9884-50f4cedac9eb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:26:54.875558 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:54.875462 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cdabd6a2-fd3b-4b45-9884-50f4cedac9eb-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "cdabd6a2-fd3b-4b45-9884-50f4cedac9eb" (UID: "cdabd6a2-fd3b-4b45-9884-50f4cedac9eb"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:26:54.875558 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:54.875472 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdabd6a2-fd3b-4b45-9884-50f4cedac9eb-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "cdabd6a2-fd3b-4b45-9884-50f4cedac9eb" (UID: "cdabd6a2-fd3b-4b45-9884-50f4cedac9eb"). InnerVolumeSpecName "secret-metrics-client-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:26:54.875558 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:54.875518 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdabd6a2-fd3b-4b45-9884-50f4cedac9eb-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "cdabd6a2-fd3b-4b45-9884-50f4cedac9eb" (UID: "cdabd6a2-fd3b-4b45-9884-50f4cedac9eb"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:26:54.875744 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:54.875564 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cdabd6a2-fd3b-4b45-9884-50f4cedac9eb-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "cdabd6a2-fd3b-4b45-9884-50f4cedac9eb" (UID: "cdabd6a2-fd3b-4b45-9884-50f4cedac9eb"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:26:54.876334 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:54.876308 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdabd6a2-fd3b-4b45-9884-50f4cedac9eb-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "cdabd6a2-fd3b-4b45-9884-50f4cedac9eb" (UID: "cdabd6a2-fd3b-4b45-9884-50f4cedac9eb"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:26:54.876567 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:54.876312 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cdabd6a2-fd3b-4b45-9884-50f4cedac9eb-config-out" (OuterVolumeSpecName: "config-out") pod "cdabd6a2-fd3b-4b45-9884-50f4cedac9eb" (UID: "cdabd6a2-fd3b-4b45-9884-50f4cedac9eb"). InnerVolumeSpecName "config-out". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:26:54.876567 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:54.876374 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cdabd6a2-fd3b-4b45-9884-50f4cedac9eb-kube-api-access-6gmh5" (OuterVolumeSpecName: "kube-api-access-6gmh5") pod "cdabd6a2-fd3b-4b45-9884-50f4cedac9eb" (UID: "cdabd6a2-fd3b-4b45-9884-50f4cedac9eb"). InnerVolumeSpecName "kube-api-access-6gmh5". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:26:54.876677 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:54.876430 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdabd6a2-fd3b-4b45-9884-50f4cedac9eb-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "cdabd6a2-fd3b-4b45-9884-50f4cedac9eb" (UID: "cdabd6a2-fd3b-4b45-9884-50f4cedac9eb"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:26:54.876677 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:54.876503 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdabd6a2-fd3b-4b45-9884-50f4cedac9eb-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "cdabd6a2-fd3b-4b45-9884-50f4cedac9eb" (UID: "cdabd6a2-fd3b-4b45-9884-50f4cedac9eb"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:26:54.877200 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:54.877179 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdabd6a2-fd3b-4b45-9884-50f4cedac9eb-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "cdabd6a2-fd3b-4b45-9884-50f4cedac9eb" (UID: "cdabd6a2-fd3b-4b45-9884-50f4cedac9eb"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:26:54.877508 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:54.877481 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdabd6a2-fd3b-4b45-9884-50f4cedac9eb-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "cdabd6a2-fd3b-4b45-9884-50f4cedac9eb" (UID: "cdabd6a2-fd3b-4b45-9884-50f4cedac9eb"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:26:54.884883 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:54.884860 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdabd6a2-fd3b-4b45-9884-50f4cedac9eb-web-config" (OuterVolumeSpecName: "web-config") pod "cdabd6a2-fd3b-4b45-9884-50f4cedac9eb" (UID: "cdabd6a2-fd3b-4b45-9884-50f4cedac9eb"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:26:54.972382 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:54.972330 2573 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cdabd6a2-fd3b-4b45-9884-50f4cedac9eb-configmap-metrics-client-ca\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\"" Apr 17 17:26:54.972382 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:54.972375 2573 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/cdabd6a2-fd3b-4b45-9884-50f4cedac9eb-secret-metrics-client-certs\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\"" Apr 17 17:26:54.972528 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:54.972402 2573 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/cdabd6a2-fd3b-4b45-9884-50f4cedac9eb-secret-grpc-tls\") on node \"ip-10-0-139-84.ec2.internal\" 
DevicePath \"\"" Apr 17 17:26:54.972528 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:54.972414 2573 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/cdabd6a2-fd3b-4b45-9884-50f4cedac9eb-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\"" Apr 17 17:26:54.972528 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:54.972423 2573 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/cdabd6a2-fd3b-4b45-9884-50f4cedac9eb-tls-assets\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\"" Apr 17 17:26:54.972528 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:54.972432 2573 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/cdabd6a2-fd3b-4b45-9884-50f4cedac9eb-thanos-prometheus-http-client-file\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\"" Apr 17 17:26:54.972528 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:54.972441 2573 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/cdabd6a2-fd3b-4b45-9884-50f4cedac9eb-config\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\"" Apr 17 17:26:54.972528 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:54.972450 2573 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/cdabd6a2-fd3b-4b45-9884-50f4cedac9eb-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\"" Apr 17 17:26:54.972528 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:54.972459 2573 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/cdabd6a2-fd3b-4b45-9884-50f4cedac9eb-secret-prometheus-k8s-tls\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\"" Apr 17 
17:26:54.972528 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:54.972468 2573 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cdabd6a2-fd3b-4b45-9884-50f4cedac9eb-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\"" Apr 17 17:26:54.972528 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:54.972477 2573 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/cdabd6a2-fd3b-4b45-9884-50f4cedac9eb-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\"" Apr 17 17:26:54.972528 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:54.972485 2573 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cdabd6a2-fd3b-4b45-9884-50f4cedac9eb-config-out\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\"" Apr 17 17:26:54.972528 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:54.972493 2573 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/cdabd6a2-fd3b-4b45-9884-50f4cedac9eb-web-config\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\"" Apr 17 17:26:54.972528 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:54.972501 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6gmh5\" (UniqueName: \"kubernetes.io/projected/cdabd6a2-fd3b-4b45-9884-50f4cedac9eb-kube-api-access-6gmh5\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\"" Apr 17 17:26:54.972528 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:54.972512 2573 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cdabd6a2-fd3b-4b45-9884-50f4cedac9eb-prometheus-trusted-ca-bundle\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\"" Apr 17 17:26:54.972528 
ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:54.972520 2573 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/cdabd6a2-fd3b-4b45-9884-50f4cedac9eb-prometheus-k8s-db\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\"" Apr 17 17:26:54.972528 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:54.972529 2573 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/cdabd6a2-fd3b-4b45-9884-50f4cedac9eb-secret-kube-rbac-proxy\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\"" Apr 17 17:26:55.767626 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:55.767584 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"cdabd6a2-fd3b-4b45-9884-50f4cedac9eb","Type":"ContainerDied","Data":"121fade55067406ef398d267181eb7d74d3096eb5a04f9f5c2e1b761a4f116ae"} Apr 17 17:26:55.767626 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:55.767626 2573 scope.go:117] "RemoveContainer" containerID="89727b57dc17340c4c2a362d59a4b875bc72746434717eafc23b229468adb487" Apr 17 17:26:55.768089 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:55.767694 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:55.774490 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:55.774471 2573 scope.go:117] "RemoveContainer" containerID="9fc62a2c246bebac2935fb24bd863cc36df724d93b7e6d30a27c3cc312bdf4d7" Apr 17 17:26:55.781052 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:55.781037 2573 scope.go:117] "RemoveContainer" containerID="b62b309f25b53cf9c6149751aad6925701fe4fc236e0259a0cd515bfbf477fc9" Apr 17 17:26:55.786603 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:55.786587 2573 scope.go:117] "RemoveContainer" containerID="660cf3ca5b750381abb2fcd7f7ff928d4913172d054968ca2f499fd3da7702be" Apr 17 17:26:55.788456 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:55.788432 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 17:26:55.793128 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:55.793108 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 17:26:55.793381 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:55.793365 2573 scope.go:117] "RemoveContainer" containerID="293f2e0c2e09fbd47008513d8289d687fdfbeb6b9894efb0e4b35caf28e8dcc5" Apr 17 17:26:55.799055 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:55.799034 2573 scope.go:117] "RemoveContainer" containerID="598024d0299ebdec827a4f51e15a3b321411dfe99f79c101e8c8fa7f93a49984" Apr 17 17:26:55.805093 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:55.805078 2573 scope.go:117] "RemoveContainer" containerID="05a8bfdd9b407639b64471d7c6eb4afd6b77ee3100d9ddec7759a1b9b62db06c" Apr 17 17:26:55.818471 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:55.818450 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 17:26:55.818756 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:55.818743 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cdabd6a2-fd3b-4b45-9884-50f4cedac9eb" 
containerName="config-reloader" Apr 17 17:26:55.818797 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:55.818759 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdabd6a2-fd3b-4b45-9884-50f4cedac9eb" containerName="config-reloader" Apr 17 17:26:55.818797 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:55.818774 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cdabd6a2-fd3b-4b45-9884-50f4cedac9eb" containerName="init-config-reloader" Apr 17 17:26:55.818797 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:55.818783 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdabd6a2-fd3b-4b45-9884-50f4cedac9eb" containerName="init-config-reloader" Apr 17 17:26:55.818899 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:55.818796 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cdabd6a2-fd3b-4b45-9884-50f4cedac9eb" containerName="kube-rbac-proxy-thanos" Apr 17 17:26:55.818899 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:55.818806 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdabd6a2-fd3b-4b45-9884-50f4cedac9eb" containerName="kube-rbac-proxy-thanos" Apr 17 17:26:55.818899 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:55.818820 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4b2224cd-10b0-4bf7-bb36-af52cc8d4236" containerName="registry" Apr 17 17:26:55.818899 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:55.818828 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b2224cd-10b0-4bf7-bb36-af52cc8d4236" containerName="registry" Apr 17 17:26:55.818899 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:55.818840 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cdabd6a2-fd3b-4b45-9884-50f4cedac9eb" containerName="kube-rbac-proxy" Apr 17 17:26:55.818899 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:55.818848 2573 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="cdabd6a2-fd3b-4b45-9884-50f4cedac9eb" containerName="kube-rbac-proxy" Apr 17 17:26:55.818899 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:55.818863 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cdabd6a2-fd3b-4b45-9884-50f4cedac9eb" containerName="kube-rbac-proxy-web" Apr 17 17:26:55.818899 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:55.818871 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdabd6a2-fd3b-4b45-9884-50f4cedac9eb" containerName="kube-rbac-proxy-web" Apr 17 17:26:55.818899 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:55.818880 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cdabd6a2-fd3b-4b45-9884-50f4cedac9eb" containerName="thanos-sidecar" Apr 17 17:26:55.818899 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:55.818888 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdabd6a2-fd3b-4b45-9884-50f4cedac9eb" containerName="thanos-sidecar" Apr 17 17:26:55.818899 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:55.818898 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cdabd6a2-fd3b-4b45-9884-50f4cedac9eb" containerName="prometheus" Apr 17 17:26:55.819217 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:55.818908 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdabd6a2-fd3b-4b45-9884-50f4cedac9eb" containerName="prometheus" Apr 17 17:26:55.819217 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:55.818968 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="cdabd6a2-fd3b-4b45-9884-50f4cedac9eb" containerName="kube-rbac-proxy-thanos" Apr 17 17:26:55.819217 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:55.818981 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="cdabd6a2-fd3b-4b45-9884-50f4cedac9eb" containerName="prometheus" Apr 17 17:26:55.819217 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:55.818990 2573 memory_manager.go:356] "RemoveStaleState removing 
state" podUID="cdabd6a2-fd3b-4b45-9884-50f4cedac9eb" containerName="kube-rbac-proxy" Apr 17 17:26:55.819217 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:55.818999 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="4b2224cd-10b0-4bf7-bb36-af52cc8d4236" containerName="registry" Apr 17 17:26:55.819217 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:55.819009 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="cdabd6a2-fd3b-4b45-9884-50f4cedac9eb" containerName="kube-rbac-proxy-web" Apr 17 17:26:55.819217 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:55.819043 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="cdabd6a2-fd3b-4b45-9884-50f4cedac9eb" containerName="config-reloader" Apr 17 17:26:55.819217 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:55.819052 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="cdabd6a2-fd3b-4b45-9884-50f4cedac9eb" containerName="thanos-sidecar" Apr 17 17:26:55.824065 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:55.824048 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:55.826805 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:55.826780 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 17 17:26:55.826909 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:55.826843 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 17 17:26:55.827034 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:55.826998 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 17 17:26:55.827034 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:55.827010 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-cf8lnue8b2bhd\"" Apr 17 17:26:55.827158 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:55.827060 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 17 17:26:55.827158 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:55.827115 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 17 17:26:55.827158 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:55.827119 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-dsmsg\"" Apr 17 17:26:55.827284 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:55.827158 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 17 17:26:55.827284 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:55.827174 2573 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 17 17:26:55.827542 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:55.827519 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 17 17:26:55.827592 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:55.827520 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 17 17:26:55.827776 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:55.827757 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 17 17:26:55.827867 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:55.827779 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 17 17:26:55.829475 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:55.829457 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 17 17:26:55.832561 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:55.832540 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 17 17:26:55.836107 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:55.836087 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 17:26:55.978374 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:55.978348 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7e87163c-fc87-4aa7-b0f1-a92c13c69a22-web-config\") pod \"prometheus-k8s-0\" (UID: \"7e87163c-fc87-4aa7-b0f1-a92c13c69a22\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:55.978523 ip-10-0-139-84 
kubenswrapper[2573]: I0417 17:26:55.978381 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/7e87163c-fc87-4aa7-b0f1-a92c13c69a22-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"7e87163c-fc87-4aa7-b0f1-a92c13c69a22\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:55.978523 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:55.978403 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7e87163c-fc87-4aa7-b0f1-a92c13c69a22-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"7e87163c-fc87-4aa7-b0f1-a92c13c69a22\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:55.978523 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:55.978426 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7e87163c-fc87-4aa7-b0f1-a92c13c69a22-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"7e87163c-fc87-4aa7-b0f1-a92c13c69a22\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:55.978523 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:55.978474 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7e87163c-fc87-4aa7-b0f1-a92c13c69a22-config-out\") pod \"prometheus-k8s-0\" (UID: \"7e87163c-fc87-4aa7-b0f1-a92c13c69a22\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:55.978523 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:55.978507 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: 
\"kubernetes.io/secret/7e87163c-fc87-4aa7-b0f1-a92c13c69a22-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"7e87163c-fc87-4aa7-b0f1-a92c13c69a22\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:55.978714 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:55.978547 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fr6h9\" (UniqueName: \"kubernetes.io/projected/7e87163c-fc87-4aa7-b0f1-a92c13c69a22-kube-api-access-fr6h9\") pod \"prometheus-k8s-0\" (UID: \"7e87163c-fc87-4aa7-b0f1-a92c13c69a22\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:55.978714 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:55.978584 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/7e87163c-fc87-4aa7-b0f1-a92c13c69a22-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"7e87163c-fc87-4aa7-b0f1-a92c13c69a22\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:55.978714 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:55.978622 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7e87163c-fc87-4aa7-b0f1-a92c13c69a22-config\") pod \"prometheus-k8s-0\" (UID: \"7e87163c-fc87-4aa7-b0f1-a92c13c69a22\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:55.978714 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:55.978644 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7e87163c-fc87-4aa7-b0f1-a92c13c69a22-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"7e87163c-fc87-4aa7-b0f1-a92c13c69a22\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:55.978714 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:55.978663 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/7e87163c-fc87-4aa7-b0f1-a92c13c69a22-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"7e87163c-fc87-4aa7-b0f1-a92c13c69a22\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:55.978714 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:55.978684 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/7e87163c-fc87-4aa7-b0f1-a92c13c69a22-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"7e87163c-fc87-4aa7-b0f1-a92c13c69a22\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:55.978907 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:55.978722 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/7e87163c-fc87-4aa7-b0f1-a92c13c69a22-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"7e87163c-fc87-4aa7-b0f1-a92c13c69a22\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:55.978907 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:55.978741 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7e87163c-fc87-4aa7-b0f1-a92c13c69a22-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"7e87163c-fc87-4aa7-b0f1-a92c13c69a22\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:55.978907 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:55.978758 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/7e87163c-fc87-4aa7-b0f1-a92c13c69a22-secret-kube-rbac-proxy\") pod 
\"prometheus-k8s-0\" (UID: \"7e87163c-fc87-4aa7-b0f1-a92c13c69a22\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:55.978907 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:55.978773 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7e87163c-fc87-4aa7-b0f1-a92c13c69a22-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"7e87163c-fc87-4aa7-b0f1-a92c13c69a22\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:55.978907 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:55.978787 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/7e87163c-fc87-4aa7-b0f1-a92c13c69a22-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"7e87163c-fc87-4aa7-b0f1-a92c13c69a22\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:55.978907 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:55.978815 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/7e87163c-fc87-4aa7-b0f1-a92c13c69a22-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"7e87163c-fc87-4aa7-b0f1-a92c13c69a22\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:56.079889 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:56.079863 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/7e87163c-fc87-4aa7-b0f1-a92c13c69a22-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"7e87163c-fc87-4aa7-b0f1-a92c13c69a22\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:56.079993 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:56.079894 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7e87163c-fc87-4aa7-b0f1-a92c13c69a22-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"7e87163c-fc87-4aa7-b0f1-a92c13c69a22\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:56.079993 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:56.079914 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/7e87163c-fc87-4aa7-b0f1-a92c13c69a22-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"7e87163c-fc87-4aa7-b0f1-a92c13c69a22\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:56.079993 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:56.079936 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/7e87163c-fc87-4aa7-b0f1-a92c13c69a22-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"7e87163c-fc87-4aa7-b0f1-a92c13c69a22\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:56.080235 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:56.080067 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7e87163c-fc87-4aa7-b0f1-a92c13c69a22-web-config\") pod \"prometheus-k8s-0\" (UID: \"7e87163c-fc87-4aa7-b0f1-a92c13c69a22\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:56.080235 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:56.080118 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/7e87163c-fc87-4aa7-b0f1-a92c13c69a22-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"7e87163c-fc87-4aa7-b0f1-a92c13c69a22\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:56.080235 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:56.080156 2573 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7e87163c-fc87-4aa7-b0f1-a92c13c69a22-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"7e87163c-fc87-4aa7-b0f1-a92c13c69a22\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:56.080235 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:56.080194 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7e87163c-fc87-4aa7-b0f1-a92c13c69a22-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"7e87163c-fc87-4aa7-b0f1-a92c13c69a22\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:56.080447 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:56.080242 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7e87163c-fc87-4aa7-b0f1-a92c13c69a22-config-out\") pod \"prometheus-k8s-0\" (UID: \"7e87163c-fc87-4aa7-b0f1-a92c13c69a22\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:56.080447 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:56.080286 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/7e87163c-fc87-4aa7-b0f1-a92c13c69a22-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"7e87163c-fc87-4aa7-b0f1-a92c13c69a22\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:56.080447 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:56.080311 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fr6h9\" (UniqueName: \"kubernetes.io/projected/7e87163c-fc87-4aa7-b0f1-a92c13c69a22-kube-api-access-fr6h9\") pod \"prometheus-k8s-0\" (UID: \"7e87163c-fc87-4aa7-b0f1-a92c13c69a22\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 
17 17:26:56.080447 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:56.080349 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/7e87163c-fc87-4aa7-b0f1-a92c13c69a22-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"7e87163c-fc87-4aa7-b0f1-a92c13c69a22\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:56.080447 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:56.080395 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7e87163c-fc87-4aa7-b0f1-a92c13c69a22-config\") pod \"prometheus-k8s-0\" (UID: \"7e87163c-fc87-4aa7-b0f1-a92c13c69a22\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:56.080447 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:56.080420 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7e87163c-fc87-4aa7-b0f1-a92c13c69a22-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"7e87163c-fc87-4aa7-b0f1-a92c13c69a22\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:56.080447 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:56.080444 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/7e87163c-fc87-4aa7-b0f1-a92c13c69a22-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"7e87163c-fc87-4aa7-b0f1-a92c13c69a22\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:56.080779 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:56.080470 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/7e87163c-fc87-4aa7-b0f1-a92c13c69a22-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"7e87163c-fc87-4aa7-b0f1-a92c13c69a22\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:56.080779 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:56.080507 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/7e87163c-fc87-4aa7-b0f1-a92c13c69a22-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"7e87163c-fc87-4aa7-b0f1-a92c13c69a22\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:56.080779 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:56.080537 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7e87163c-fc87-4aa7-b0f1-a92c13c69a22-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"7e87163c-fc87-4aa7-b0f1-a92c13c69a22\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:56.080989 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:56.080969 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7e87163c-fc87-4aa7-b0f1-a92c13c69a22-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"7e87163c-fc87-4aa7-b0f1-a92c13c69a22\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:56.081121 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:56.081098 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7e87163c-fc87-4aa7-b0f1-a92c13c69a22-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"7e87163c-fc87-4aa7-b0f1-a92c13c69a22\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:56.081237 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:56.081219 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/7e87163c-fc87-4aa7-b0f1-a92c13c69a22-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"7e87163c-fc87-4aa7-b0f1-a92c13c69a22\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:56.081362 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:56.081340 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/7e87163c-fc87-4aa7-b0f1-a92c13c69a22-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"7e87163c-fc87-4aa7-b0f1-a92c13c69a22\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:56.082081 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:56.082057 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7e87163c-fc87-4aa7-b0f1-a92c13c69a22-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"7e87163c-fc87-4aa7-b0f1-a92c13c69a22\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:56.084071 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:56.083161 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/7e87163c-fc87-4aa7-b0f1-a92c13c69a22-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"7e87163c-fc87-4aa7-b0f1-a92c13c69a22\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:56.084071 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:56.083476 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/7e87163c-fc87-4aa7-b0f1-a92c13c69a22-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"7e87163c-fc87-4aa7-b0f1-a92c13c69a22\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:56.084071 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:56.083517 2573 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/7e87163c-fc87-4aa7-b0f1-a92c13c69a22-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"7e87163c-fc87-4aa7-b0f1-a92c13c69a22\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:56.084071 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:56.083636 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7e87163c-fc87-4aa7-b0f1-a92c13c69a22-web-config\") pod \"prometheus-k8s-0\" (UID: \"7e87163c-fc87-4aa7-b0f1-a92c13c69a22\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:56.084595 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:56.084573 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7e87163c-fc87-4aa7-b0f1-a92c13c69a22-config-out\") pod \"prometheus-k8s-0\" (UID: \"7e87163c-fc87-4aa7-b0f1-a92c13c69a22\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:56.085334 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:56.085043 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/7e87163c-fc87-4aa7-b0f1-a92c13c69a22-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"7e87163c-fc87-4aa7-b0f1-a92c13c69a22\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:56.085334 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:56.085294 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/7e87163c-fc87-4aa7-b0f1-a92c13c69a22-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"7e87163c-fc87-4aa7-b0f1-a92c13c69a22\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:56.085535 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:56.085516 2573 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/7e87163c-fc87-4aa7-b0f1-a92c13c69a22-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"7e87163c-fc87-4aa7-b0f1-a92c13c69a22\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:56.085600 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:56.085538 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/7e87163c-fc87-4aa7-b0f1-a92c13c69a22-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"7e87163c-fc87-4aa7-b0f1-a92c13c69a22\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:56.085600 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:56.085568 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7e87163c-fc87-4aa7-b0f1-a92c13c69a22-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"7e87163c-fc87-4aa7-b0f1-a92c13c69a22\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:56.085709 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:56.085619 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/7e87163c-fc87-4aa7-b0f1-a92c13c69a22-config\") pod \"prometheus-k8s-0\" (UID: \"7e87163c-fc87-4aa7-b0f1-a92c13c69a22\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:56.086031 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:56.085993 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/7e87163c-fc87-4aa7-b0f1-a92c13c69a22-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"7e87163c-fc87-4aa7-b0f1-a92c13c69a22\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:56.091031 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:56.091002 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-fr6h9\" (UniqueName: \"kubernetes.io/projected/7e87163c-fc87-4aa7-b0f1-a92c13c69a22-kube-api-access-fr6h9\") pod \"prometheus-k8s-0\" (UID: \"7e87163c-fc87-4aa7-b0f1-a92c13c69a22\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:56.134097 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:56.134075 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:56.282683 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:56.282661 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 17:26:56.284620 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:26:56.284590 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e87163c_fc87_4aa7_b0f1_a92c13c69a22.slice/crio-734b946f0841b664ffa0ab89ac24c0a487ce86ef2f78c29787d46d1af65ccb31 WatchSource:0}: Error finding container 734b946f0841b664ffa0ab89ac24c0a487ce86ef2f78c29787d46d1af65ccb31: Status 404 returned error can't find the container with id 734b946f0841b664ffa0ab89ac24c0a487ce86ef2f78c29787d46d1af65ccb31 Apr 17 17:26:56.772532 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:56.772499 2573 generic.go:358] "Generic (PLEG): container finished" podID="7e87163c-fc87-4aa7-b0f1-a92c13c69a22" containerID="1bd7451e7ea9603d1a43570cc35a77ba5ef94d76d56cd432f2185176d056a8ba" exitCode=0 Apr 17 17:26:56.772860 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:56.772570 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"7e87163c-fc87-4aa7-b0f1-a92c13c69a22","Type":"ContainerDied","Data":"1bd7451e7ea9603d1a43570cc35a77ba5ef94d76d56cd432f2185176d056a8ba"} Apr 17 17:26:56.772860 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:56.772588 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"7e87163c-fc87-4aa7-b0f1-a92c13c69a22","Type":"ContainerStarted","Data":"734b946f0841b664ffa0ab89ac24c0a487ce86ef2f78c29787d46d1af65ccb31"} Apr 17 17:26:57.195938 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:57.195908 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cdabd6a2-fd3b-4b45-9884-50f4cedac9eb" path="/var/lib/kubelet/pods/cdabd6a2-fd3b-4b45-9884-50f4cedac9eb/volumes" Apr 17 17:26:57.778842 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:57.778802 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"7e87163c-fc87-4aa7-b0f1-a92c13c69a22","Type":"ContainerStarted","Data":"d3416f318875c97e7e1ec4ab739f2552e66322d7a1958c32597b5b0936f88773"} Apr 17 17:26:57.779213 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:57.778847 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"7e87163c-fc87-4aa7-b0f1-a92c13c69a22","Type":"ContainerStarted","Data":"616f391f3b5b106e115531151712e002d30edc9c30038e7bf9f89ca64ad2af93"} Apr 17 17:26:57.779213 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:57.778861 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"7e87163c-fc87-4aa7-b0f1-a92c13c69a22","Type":"ContainerStarted","Data":"d60db8f044e2bad35ae34dd6974381947e085a862192de70b781183db586d9c1"} Apr 17 17:26:57.779213 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:57.778873 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"7e87163c-fc87-4aa7-b0f1-a92c13c69a22","Type":"ContainerStarted","Data":"506a746f27dc24d303693f49209e030ef3667ce2b7fde0491b223446dba8d744"} Apr 17 17:26:57.779213 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:57.778887 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"7e87163c-fc87-4aa7-b0f1-a92c13c69a22","Type":"ContainerStarted","Data":"7f55b767715760262df18121b3a0f384df2e4ae0dba8ae512c65add1293e4133"} Apr 17 17:26:57.779213 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:57.778900 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"7e87163c-fc87-4aa7-b0f1-a92c13c69a22","Type":"ContainerStarted","Data":"34dd147b1f7322d4b463b7c0a6cc7207425cca9039b3c4995870d40612b99b30"} Apr 17 17:26:57.814778 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:26:57.814731 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.814698124 podStartE2EDuration="2.814698124s" podCreationTimestamp="2026-04-17 17:26:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:26:57.813407094 +0000 UTC m=+163.129488972" watchObservedRunningTime="2026-04-17 17:26:57.814698124 +0000 UTC m=+163.130779965" Apr 17 17:27:01.134460 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:27:01.134419 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:27:56.135281 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:27:56.135249 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:27:56.149994 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:27:56.149959 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:27:56.952312 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:27:56.952284 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:29:15.090942 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:29:15.090912 2573 kubelet.go:1628] "Image garbage collection 
succeeded" Apr 17 17:30:53.273218 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:30:53.273182 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-r5kkx"] Apr 17 17:30:53.276487 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:30:53.276470 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-r5kkx" Apr 17 17:30:53.278786 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:30:53.278764 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 17 17:30:53.279705 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:30:53.279688 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 17 17:30:53.279773 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:30:53.279693 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"openshift-lws-operator-dockercfg-dffjj\"" Apr 17 17:30:53.285150 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:30:53.285130 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-r5kkx"] Apr 17 17:30:53.410955 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:30:53.410926 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1b3d3a9b-a10b-4570-b17c-6583411b2763-tmp\") pod \"openshift-lws-operator-bfc7f696d-r5kkx\" (UID: \"1b3d3a9b-a10b-4570-b17c-6583411b2763\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-r5kkx" Apr 17 17:30:53.411118 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:30:53.410958 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8ft5\" (UniqueName: 
\"kubernetes.io/projected/1b3d3a9b-a10b-4570-b17c-6583411b2763-kube-api-access-m8ft5\") pod \"openshift-lws-operator-bfc7f696d-r5kkx\" (UID: \"1b3d3a9b-a10b-4570-b17c-6583411b2763\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-r5kkx" Apr 17 17:30:53.511875 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:30:53.511847 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1b3d3a9b-a10b-4570-b17c-6583411b2763-tmp\") pod \"openshift-lws-operator-bfc7f696d-r5kkx\" (UID: \"1b3d3a9b-a10b-4570-b17c-6583411b2763\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-r5kkx" Apr 17 17:30:53.512045 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:30:53.511880 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m8ft5\" (UniqueName: \"kubernetes.io/projected/1b3d3a9b-a10b-4570-b17c-6583411b2763-kube-api-access-m8ft5\") pod \"openshift-lws-operator-bfc7f696d-r5kkx\" (UID: \"1b3d3a9b-a10b-4570-b17c-6583411b2763\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-r5kkx" Apr 17 17:30:53.512292 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:30:53.512271 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1b3d3a9b-a10b-4570-b17c-6583411b2763-tmp\") pod \"openshift-lws-operator-bfc7f696d-r5kkx\" (UID: \"1b3d3a9b-a10b-4570-b17c-6583411b2763\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-r5kkx" Apr 17 17:30:53.520189 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:30:53.520168 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8ft5\" (UniqueName: \"kubernetes.io/projected/1b3d3a9b-a10b-4570-b17c-6583411b2763-kube-api-access-m8ft5\") pod \"openshift-lws-operator-bfc7f696d-r5kkx\" (UID: \"1b3d3a9b-a10b-4570-b17c-6583411b2763\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-r5kkx" Apr 17 
17:30:53.585300 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:30:53.585243 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-r5kkx" Apr 17 17:30:53.700083 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:30:53.700046 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-r5kkx"] Apr 17 17:30:53.703045 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:30:53.703007 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1b3d3a9b_a10b_4570_b17c_6583411b2763.slice/crio-1360bf8601e5854e7ca351ac43dcc7e83ecc9d34503097148707d00c88330671 WatchSource:0}: Error finding container 1360bf8601e5854e7ca351ac43dcc7e83ecc9d34503097148707d00c88330671: Status 404 returned error can't find the container with id 1360bf8601e5854e7ca351ac43dcc7e83ecc9d34503097148707d00c88330671 Apr 17 17:30:53.704301 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:30:53.704285 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 17:30:54.400132 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:30:54.400099 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-r5kkx" event={"ID":"1b3d3a9b-a10b-4570-b17c-6583411b2763","Type":"ContainerStarted","Data":"1360bf8601e5854e7ca351ac43dcc7e83ecc9d34503097148707d00c88330671"} Apr 17 17:30:58.412938 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:30:58.412906 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-r5kkx" event={"ID":"1b3d3a9b-a10b-4570-b17c-6583411b2763","Type":"ContainerStarted","Data":"9934742914f677aa0cc00725b2b13a6b0511bb739bf0b476ec33c58567e3fd54"} Apr 17 17:30:58.437907 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:30:58.437865 2573 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-r5kkx" podStartSLOduration=1.2529398440000001 podStartE2EDuration="5.437852001s" podCreationTimestamp="2026-04-17 17:30:53 +0000 UTC" firstStartedPulling="2026-04-17 17:30:53.704403131 +0000 UTC m=+399.020484951" lastFinishedPulling="2026-04-17 17:30:57.889315289 +0000 UTC m=+403.205397108" observedRunningTime="2026-04-17 17:30:58.43747457 +0000 UTC m=+403.753556413" watchObservedRunningTime="2026-04-17 17:30:58.437852001 +0000 UTC m=+403.753933842" Apr 17 17:31:18.704127 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:31:18.704089 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-6dd684f56d-spb9f"] Apr 17 17:31:18.707198 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:31:18.707179 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-6dd684f56d-spb9f" Apr 17 17:31:18.712389 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:31:18.712362 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 17 17:31:18.712511 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:31:18.712368 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 17 17:31:18.713125 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:31:18.713102 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 17 17:31:18.713751 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:31:18.713559 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-sjwd4\"" Apr 17 17:31:18.734728 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:31:18.734708 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-lws-operator/lws-controller-manager-6dd684f56d-spb9f"] Apr 17 17:31:18.897214 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:31:18.897184 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44dh7\" (UniqueName: \"kubernetes.io/projected/df9c0da7-c1ce-42f4-a69f-89489de90e86-kube-api-access-44dh7\") pod \"lws-controller-manager-6dd684f56d-spb9f\" (UID: \"df9c0da7-c1ce-42f4-a69f-89489de90e86\") " pod="openshift-lws-operator/lws-controller-manager-6dd684f56d-spb9f" Apr 17 17:31:18.897214 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:31:18.897218 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/df9c0da7-c1ce-42f4-a69f-89489de90e86-cert\") pod \"lws-controller-manager-6dd684f56d-spb9f\" (UID: \"df9c0da7-c1ce-42f4-a69f-89489de90e86\") " pod="openshift-lws-operator/lws-controller-manager-6dd684f56d-spb9f" Apr 17 17:31:18.897440 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:31:18.897243 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/df9c0da7-c1ce-42f4-a69f-89489de90e86-manager-config\") pod \"lws-controller-manager-6dd684f56d-spb9f\" (UID: \"df9c0da7-c1ce-42f4-a69f-89489de90e86\") " pod="openshift-lws-operator/lws-controller-manager-6dd684f56d-spb9f" Apr 17 17:31:18.897440 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:31:18.897293 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/df9c0da7-c1ce-42f4-a69f-89489de90e86-metrics-cert\") pod \"lws-controller-manager-6dd684f56d-spb9f\" (UID: \"df9c0da7-c1ce-42f4-a69f-89489de90e86\") " pod="openshift-lws-operator/lws-controller-manager-6dd684f56d-spb9f" Apr 17 17:31:18.998111 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:31:18.998032 2573 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-44dh7\" (UniqueName: \"kubernetes.io/projected/df9c0da7-c1ce-42f4-a69f-89489de90e86-kube-api-access-44dh7\") pod \"lws-controller-manager-6dd684f56d-spb9f\" (UID: \"df9c0da7-c1ce-42f4-a69f-89489de90e86\") " pod="openshift-lws-operator/lws-controller-manager-6dd684f56d-spb9f" Apr 17 17:31:18.998111 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:31:18.998073 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/df9c0da7-c1ce-42f4-a69f-89489de90e86-cert\") pod \"lws-controller-manager-6dd684f56d-spb9f\" (UID: \"df9c0da7-c1ce-42f4-a69f-89489de90e86\") " pod="openshift-lws-operator/lws-controller-manager-6dd684f56d-spb9f" Apr 17 17:31:18.998287 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:31:18.998217 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/df9c0da7-c1ce-42f4-a69f-89489de90e86-manager-config\") pod \"lws-controller-manager-6dd684f56d-spb9f\" (UID: \"df9c0da7-c1ce-42f4-a69f-89489de90e86\") " pod="openshift-lws-operator/lws-controller-manager-6dd684f56d-spb9f" Apr 17 17:31:18.998287 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:31:18.998255 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/df9c0da7-c1ce-42f4-a69f-89489de90e86-metrics-cert\") pod \"lws-controller-manager-6dd684f56d-spb9f\" (UID: \"df9c0da7-c1ce-42f4-a69f-89489de90e86\") " pod="openshift-lws-operator/lws-controller-manager-6dd684f56d-spb9f" Apr 17 17:31:18.998776 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:31:18.998751 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/df9c0da7-c1ce-42f4-a69f-89489de90e86-manager-config\") pod \"lws-controller-manager-6dd684f56d-spb9f\" (UID: 
\"df9c0da7-c1ce-42f4-a69f-89489de90e86\") " pod="openshift-lws-operator/lws-controller-manager-6dd684f56d-spb9f" Apr 17 17:31:19.000476 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:31:19.000437 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/df9c0da7-c1ce-42f4-a69f-89489de90e86-metrics-cert\") pod \"lws-controller-manager-6dd684f56d-spb9f\" (UID: \"df9c0da7-c1ce-42f4-a69f-89489de90e86\") " pod="openshift-lws-operator/lws-controller-manager-6dd684f56d-spb9f" Apr 17 17:31:19.000603 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:31:19.000581 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/df9c0da7-c1ce-42f4-a69f-89489de90e86-cert\") pod \"lws-controller-manager-6dd684f56d-spb9f\" (UID: \"df9c0da7-c1ce-42f4-a69f-89489de90e86\") " pod="openshift-lws-operator/lws-controller-manager-6dd684f56d-spb9f" Apr 17 17:31:19.096430 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:31:19.096393 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-44dh7\" (UniqueName: \"kubernetes.io/projected/df9c0da7-c1ce-42f4-a69f-89489de90e86-kube-api-access-44dh7\") pod \"lws-controller-manager-6dd684f56d-spb9f\" (UID: \"df9c0da7-c1ce-42f4-a69f-89489de90e86\") " pod="openshift-lws-operator/lws-controller-manager-6dd684f56d-spb9f" Apr 17 17:31:19.316210 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:31:19.316115 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-6dd684f56d-spb9f" Apr 17 17:31:19.446849 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:31:19.446818 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-6dd684f56d-spb9f"] Apr 17 17:31:19.449685 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:31:19.449656 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf9c0da7_c1ce_42f4_a69f_89489de90e86.slice/crio-ddd746f99b659cbc69c971747b72709073a34d3359aa3721fc309f875bbf9615 WatchSource:0}: Error finding container ddd746f99b659cbc69c971747b72709073a34d3359aa3721fc309f875bbf9615: Status 404 returned error can't find the container with id ddd746f99b659cbc69c971747b72709073a34d3359aa3721fc309f875bbf9615 Apr 17 17:31:19.467204 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:31:19.467177 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-6dd684f56d-spb9f" event={"ID":"df9c0da7-c1ce-42f4-a69f-89489de90e86","Type":"ContainerStarted","Data":"ddd746f99b659cbc69c971747b72709073a34d3359aa3721fc309f875bbf9615"} Apr 17 17:31:21.473885 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:31:21.473848 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-6dd684f56d-spb9f" event={"ID":"df9c0da7-c1ce-42f4-a69f-89489de90e86","Type":"ContainerStarted","Data":"f5492acd24eff74254933f0edc62e5cc1d1aaa54b95068de9a47df0ad0b4e0ae"} Apr 17 17:31:21.474336 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:31:21.473981 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-6dd684f56d-spb9f" Apr 17 17:31:21.501870 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:31:21.501828 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-lws-operator/lws-controller-manager-6dd684f56d-spb9f" podStartSLOduration=2.057475122 podStartE2EDuration="3.501816816s" podCreationTimestamp="2026-04-17 17:31:18 +0000 UTC" firstStartedPulling="2026-04-17 17:31:19.451693556 +0000 UTC m=+424.767775377" lastFinishedPulling="2026-04-17 17:31:20.896035242 +0000 UTC m=+426.212117071" observedRunningTime="2026-04-17 17:31:21.499994077 +0000 UTC m=+426.816075918" watchObservedRunningTime="2026-04-17 17:31:21.501816816 +0000 UTC m=+426.817898659" Apr 17 17:31:32.478446 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:31:32.478416 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-6dd684f56d-spb9f" Apr 17 17:32:07.054473 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:32:07.054443 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-5rwgs"] Apr 17 17:32:07.057461 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:32:07.057442 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-5rwgs" Apr 17 17:32:07.060119 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:32:07.060098 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-5gl2j\"" Apr 17 17:32:07.060330 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:32:07.060312 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 17 17:32:07.060551 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:32:07.060536 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 17 17:32:07.068897 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:32:07.068868 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-5rwgs"] Apr 17 17:32:07.137705 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:32:07.137669 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/d9d29c98-87be-4bb4-9834-786686ba286d-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-5rwgs\" (UID: \"d9d29c98-87be-4bb4-9834-786686ba286d\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-5rwgs" Apr 17 17:32:07.137882 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:32:07.137782 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-msffj\" (UniqueName: \"kubernetes.io/projected/d9d29c98-87be-4bb4-9834-786686ba286d-kube-api-access-msffj\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-5rwgs\" (UID: \"d9d29c98-87be-4bb4-9834-786686ba286d\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-5rwgs" Apr 17 17:32:07.238934 
ip-10-0-139-84 kubenswrapper[2573]: I0417 17:32:07.238900 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-msffj\" (UniqueName: \"kubernetes.io/projected/d9d29c98-87be-4bb4-9834-786686ba286d-kube-api-access-msffj\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-5rwgs\" (UID: \"d9d29c98-87be-4bb4-9834-786686ba286d\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-5rwgs" Apr 17 17:32:07.239125 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:32:07.238962 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/d9d29c98-87be-4bb4-9834-786686ba286d-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-5rwgs\" (UID: \"d9d29c98-87be-4bb4-9834-786686ba286d\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-5rwgs" Apr 17 17:32:07.239880 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:32:07.239860 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/d9d29c98-87be-4bb4-9834-786686ba286d-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-5rwgs\" (UID: \"d9d29c98-87be-4bb4-9834-786686ba286d\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-5rwgs" Apr 17 17:32:07.251363 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:32:07.251339 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-msffj\" (UniqueName: \"kubernetes.io/projected/d9d29c98-87be-4bb4-9834-786686ba286d-kube-api-access-msffj\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-5rwgs\" (UID: \"d9d29c98-87be-4bb4-9834-786686ba286d\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-5rwgs" Apr 17 17:32:07.367190 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:32:07.367168 2573 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-5rwgs" Apr 17 17:32:07.486538 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:32:07.486507 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-5rwgs"] Apr 17 17:32:07.489809 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:32:07.489781 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd9d29c98_87be_4bb4_9834_786686ba286d.slice/crio-46b48f4dff666269a2e983150addaf85c399da4cb0110941254726d6c7994091 WatchSource:0}: Error finding container 46b48f4dff666269a2e983150addaf85c399da4cb0110941254726d6c7994091: Status 404 returned error can't find the container with id 46b48f4dff666269a2e983150addaf85c399da4cb0110941254726d6c7994091 Apr 17 17:32:07.599876 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:32:07.599844 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-5rwgs" event={"ID":"d9d29c98-87be-4bb4-9834-786686ba286d","Type":"ContainerStarted","Data":"46b48f4dff666269a2e983150addaf85c399da4cb0110941254726d6c7994091"} Apr 17 17:32:11.222713 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:32:11.222653 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/dns-operator-controller-manager-844548ff4c-czcpz"] Apr 17 17:32:11.226230 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:32:11.226210 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-czcpz" Apr 17 17:32:11.228637 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:32:11.228612 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-manager-dockercfg-gzd24\"" Apr 17 17:32:11.228738 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:32:11.228612 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-env\"" Apr 17 17:32:11.237790 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:32:11.237754 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-844548ff4c-czcpz"] Apr 17 17:32:11.273103 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:32:11.273066 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9n2p7\" (UniqueName: \"kubernetes.io/projected/54a34ad5-8be6-4d23-ab29-da07334381ed-kube-api-access-9n2p7\") pod \"dns-operator-controller-manager-844548ff4c-czcpz\" (UID: \"54a34ad5-8be6-4d23-ab29-da07334381ed\") " pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-czcpz" Apr 17 17:32:11.373728 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:32:11.373694 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9n2p7\" (UniqueName: \"kubernetes.io/projected/54a34ad5-8be6-4d23-ab29-da07334381ed-kube-api-access-9n2p7\") pod \"dns-operator-controller-manager-844548ff4c-czcpz\" (UID: \"54a34ad5-8be6-4d23-ab29-da07334381ed\") " pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-czcpz" Apr 17 17:32:11.384426 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:32:11.384392 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9n2p7\" (UniqueName: \"kubernetes.io/projected/54a34ad5-8be6-4d23-ab29-da07334381ed-kube-api-access-9n2p7\") pod 
\"dns-operator-controller-manager-844548ff4c-czcpz\" (UID: \"54a34ad5-8be6-4d23-ab29-da07334381ed\") " pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-czcpz" Apr 17 17:32:11.538375 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:32:11.538299 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-czcpz" Apr 17 17:32:11.660198 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:32:11.660180 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-844548ff4c-czcpz"] Apr 17 17:32:11.663038 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:32:11.662998 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod54a34ad5_8be6_4d23_ab29_da07334381ed.slice/crio-2db6d42e300b35029541becce76369055d98b354003ff226aed0744a9486035d WatchSource:0}: Error finding container 2db6d42e300b35029541becce76369055d98b354003ff226aed0744a9486035d: Status 404 returned error can't find the container with id 2db6d42e300b35029541becce76369055d98b354003ff226aed0744a9486035d Apr 17 17:32:12.616068 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:32:12.616009 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-czcpz" event={"ID":"54a34ad5-8be6-4d23-ab29-da07334381ed","Type":"ContainerStarted","Data":"2db6d42e300b35029541becce76369055d98b354003ff226aed0744a9486035d"} Apr 17 17:32:15.626363 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:32:15.626332 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-5rwgs" event={"ID":"d9d29c98-87be-4bb4-9834-786686ba286d","Type":"ContainerStarted","Data":"a44352eed415f6665a3e744b53e18e0b49c3e18c7fdfe35c750e803182b1d78c"} Apr 17 17:32:15.626725 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:32:15.626539 2573 kubelet.go:2658] 
"SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-5rwgs" Apr 17 17:32:15.647411 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:32:15.646856 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-5rwgs" podStartSLOduration=0.996263515 podStartE2EDuration="8.646838676s" podCreationTimestamp="2026-04-17 17:32:07 +0000 UTC" firstStartedPulling="2026-04-17 17:32:07.492008488 +0000 UTC m=+472.808090309" lastFinishedPulling="2026-04-17 17:32:15.142583639 +0000 UTC m=+480.458665470" observedRunningTime="2026-04-17 17:32:15.645828582 +0000 UTC m=+480.961910424" watchObservedRunningTime="2026-04-17 17:32:15.646838676 +0000 UTC m=+480.962920519" Apr 17 17:32:18.343520 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:32:18.343495 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-env\"" Apr 17 17:32:18.637144 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:32:18.637107 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-czcpz" event={"ID":"54a34ad5-8be6-4d23-ab29-da07334381ed","Type":"ContainerStarted","Data":"eae68ede6974918073cade4f445f5b027c092116827367bbdd56bfb63a487358"} Apr 17 17:32:18.637292 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:32:18.637259 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-czcpz" Apr 17 17:32:18.666504 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:32:18.666462 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-czcpz" podStartSLOduration=0.993035086 podStartE2EDuration="7.666451033s" podCreationTimestamp="2026-04-17 17:32:11 +0000 UTC" firstStartedPulling="2026-04-17 
17:32:11.665779294 +0000 UTC m=+476.981861114" lastFinishedPulling="2026-04-17 17:32:18.339195241 +0000 UTC m=+483.655277061" observedRunningTime="2026-04-17 17:32:18.664412529 +0000 UTC m=+483.980494371" watchObservedRunningTime="2026-04-17 17:32:18.666451033 +0000 UTC m=+483.982532874" Apr 17 17:32:26.632351 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:32:26.632320 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-5rwgs" Apr 17 17:32:29.642180 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:32:29.642132 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-czcpz" Apr 17 17:33:02.403983 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:33:02.403948 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-2qgzn"] Apr 17 17:33:02.412715 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:33:02.412691 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-67566c68b4-2qgzn" Apr 17 17:33:02.415682 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:33:02.415645 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-bblll\"" Apr 17 17:33:02.415682 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:33:02.415645 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Apr 17 17:33:02.416300 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:33:02.416274 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-2qgzn"] Apr 17 17:33:02.434663 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:33:02.434638 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-2qgzn"] Apr 17 17:33:02.453792 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:33:02.453758 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/432af95c-352c-46b0-918c-f2d2a5abd0c4-config-file\") pod \"limitador-limitador-67566c68b4-2qgzn\" (UID: \"432af95c-352c-46b0-918c-f2d2a5abd0c4\") " pod="kuadrant-system/limitador-limitador-67566c68b4-2qgzn" Apr 17 17:33:02.453953 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:33:02.453826 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgrsf\" (UniqueName: \"kubernetes.io/projected/432af95c-352c-46b0-918c-f2d2a5abd0c4-kube-api-access-lgrsf\") pod \"limitador-limitador-67566c68b4-2qgzn\" (UID: \"432af95c-352c-46b0-918c-f2d2a5abd0c4\") " pod="kuadrant-system/limitador-limitador-67566c68b4-2qgzn" Apr 17 17:33:02.554709 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:33:02.554672 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: 
\"kubernetes.io/configmap/432af95c-352c-46b0-918c-f2d2a5abd0c4-config-file\") pod \"limitador-limitador-67566c68b4-2qgzn\" (UID: \"432af95c-352c-46b0-918c-f2d2a5abd0c4\") " pod="kuadrant-system/limitador-limitador-67566c68b4-2qgzn" Apr 17 17:33:02.554883 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:33:02.554734 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lgrsf\" (UniqueName: \"kubernetes.io/projected/432af95c-352c-46b0-918c-f2d2a5abd0c4-kube-api-access-lgrsf\") pod \"limitador-limitador-67566c68b4-2qgzn\" (UID: \"432af95c-352c-46b0-918c-f2d2a5abd0c4\") " pod="kuadrant-system/limitador-limitador-67566c68b4-2qgzn" Apr 17 17:33:02.555402 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:33:02.555381 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/432af95c-352c-46b0-918c-f2d2a5abd0c4-config-file\") pod \"limitador-limitador-67566c68b4-2qgzn\" (UID: \"432af95c-352c-46b0-918c-f2d2a5abd0c4\") " pod="kuadrant-system/limitador-limitador-67566c68b4-2qgzn" Apr 17 17:33:02.568100 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:33:02.568073 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgrsf\" (UniqueName: \"kubernetes.io/projected/432af95c-352c-46b0-918c-f2d2a5abd0c4-kube-api-access-lgrsf\") pod \"limitador-limitador-67566c68b4-2qgzn\" (UID: \"432af95c-352c-46b0-918c-f2d2a5abd0c4\") " pod="kuadrant-system/limitador-limitador-67566c68b4-2qgzn" Apr 17 17:33:02.724828 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:33:02.724755 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-67566c68b4-2qgzn" Apr 17 17:33:02.841619 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:33:02.841594 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-2qgzn"] Apr 17 17:33:02.844315 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:33:02.844285 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod432af95c_352c_46b0_918c_f2d2a5abd0c4.slice/crio-8ce6d7dfb73da28d8267618edd1e0da8f01fd4df6700f57d253a892b548321b6 WatchSource:0}: Error finding container 8ce6d7dfb73da28d8267618edd1e0da8f01fd4df6700f57d253a892b548321b6: Status 404 returned error can't find the container with id 8ce6d7dfb73da28d8267618edd1e0da8f01fd4df6700f57d253a892b548321b6 Apr 17 17:33:03.768816 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:33:03.768782 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-67566c68b4-2qgzn" event={"ID":"432af95c-352c-46b0-918c-f2d2a5abd0c4","Type":"ContainerStarted","Data":"8ce6d7dfb73da28d8267618edd1e0da8f01fd4df6700f57d253a892b548321b6"} Apr 17 17:33:09.787954 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:33:09.787876 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-67566c68b4-2qgzn" event={"ID":"432af95c-352c-46b0-918c-f2d2a5abd0c4","Type":"ContainerStarted","Data":"1b8de4f72128f348ec583ec358c4e2869b10ed3a9dfd48e54f18febb7070b428"} Apr 17 17:33:09.787954 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:33:09.787933 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-67566c68b4-2qgzn" Apr 17 17:33:09.807158 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:33:09.807109 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-67566c68b4-2qgzn" podStartSLOduration=1.238938327 
podStartE2EDuration="7.807097169s" podCreationTimestamp="2026-04-17 17:33:02 +0000 UTC" firstStartedPulling="2026-04-17 17:33:02.846121172 +0000 UTC m=+528.162202993" lastFinishedPulling="2026-04-17 17:33:09.41428001 +0000 UTC m=+534.730361835" observedRunningTime="2026-04-17 17:33:09.805786243 +0000 UTC m=+535.121868086" watchObservedRunningTime="2026-04-17 17:33:09.807097169 +0000 UTC m=+535.123179010" Apr 17 17:33:20.794473 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:33:20.794437 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-67566c68b4-2qgzn" Apr 17 17:36:05.024761 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:36:05.024687 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5df9d587fc-xv27d"] Apr 17 17:36:05.027900 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:36:05.027884 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5df9d587fc-xv27d" Apr 17 17:36:05.030936 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:36:05.030912 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 17 17:36:05.031069 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:36:05.030936 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-configmap-ref-test-kserve-self-signed-certs\"" Apr 17 17:36:05.032139 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:36:05.032123 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-rf529\"" Apr 17 17:36:05.036725 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:36:05.036705 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 17 17:36:05.055169 ip-10-0-139-84 
kubenswrapper[2573]: I0417 17:36:05.055145 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5df9d587fc-xv27d"] Apr 17 17:36:05.089732 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:36:05.089711 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f19b325b-4ac5-43a4-9c9b-1c3d55547cbc-home\") pod \"scheduler-configmap-ref-test-kserve-5df9d587fc-xv27d\" (UID: \"f19b325b-4ac5-43a4-9c9b-1c3d55547cbc\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5df9d587fc-xv27d" Apr 17 17:36:05.089839 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:36:05.089748 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdzfc\" (UniqueName: \"kubernetes.io/projected/f19b325b-4ac5-43a4-9c9b-1c3d55547cbc-kube-api-access-fdzfc\") pod \"scheduler-configmap-ref-test-kserve-5df9d587fc-xv27d\" (UID: \"f19b325b-4ac5-43a4-9c9b-1c3d55547cbc\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5df9d587fc-xv27d" Apr 17 17:36:05.089839 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:36:05.089776 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f19b325b-4ac5-43a4-9c9b-1c3d55547cbc-model-cache\") pod \"scheduler-configmap-ref-test-kserve-5df9d587fc-xv27d\" (UID: \"f19b325b-4ac5-43a4-9c9b-1c3d55547cbc\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5df9d587fc-xv27d" Apr 17 17:36:05.089839 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:36:05.089795 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f19b325b-4ac5-43a4-9c9b-1c3d55547cbc-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-5df9d587fc-xv27d\" (UID: 
\"f19b325b-4ac5-43a4-9c9b-1c3d55547cbc\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5df9d587fc-xv27d" Apr 17 17:36:05.089839 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:36:05.089819 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f19b325b-4ac5-43a4-9c9b-1c3d55547cbc-dshm\") pod \"scheduler-configmap-ref-test-kserve-5df9d587fc-xv27d\" (UID: \"f19b325b-4ac5-43a4-9c9b-1c3d55547cbc\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5df9d587fc-xv27d" Apr 17 17:36:05.089996 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:36:05.089873 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f19b325b-4ac5-43a4-9c9b-1c3d55547cbc-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-5df9d587fc-xv27d\" (UID: \"f19b325b-4ac5-43a4-9c9b-1c3d55547cbc\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5df9d587fc-xv27d" Apr 17 17:36:05.191061 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:36:05.191039 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f19b325b-4ac5-43a4-9c9b-1c3d55547cbc-model-cache\") pod \"scheduler-configmap-ref-test-kserve-5df9d587fc-xv27d\" (UID: \"f19b325b-4ac5-43a4-9c9b-1c3d55547cbc\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5df9d587fc-xv27d" Apr 17 17:36:05.191152 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:36:05.191067 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f19b325b-4ac5-43a4-9c9b-1c3d55547cbc-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-5df9d587fc-xv27d\" (UID: \"f19b325b-4ac5-43a4-9c9b-1c3d55547cbc\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5df9d587fc-xv27d" 
Apr 17 17:36:05.191152 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:36:05.191097 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f19b325b-4ac5-43a4-9c9b-1c3d55547cbc-dshm\") pod \"scheduler-configmap-ref-test-kserve-5df9d587fc-xv27d\" (UID: \"f19b325b-4ac5-43a4-9c9b-1c3d55547cbc\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5df9d587fc-xv27d" Apr 17 17:36:05.191152 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:36:05.191121 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f19b325b-4ac5-43a4-9c9b-1c3d55547cbc-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-5df9d587fc-xv27d\" (UID: \"f19b325b-4ac5-43a4-9c9b-1c3d55547cbc\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5df9d587fc-xv27d" Apr 17 17:36:05.191274 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:36:05.191171 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f19b325b-4ac5-43a4-9c9b-1c3d55547cbc-home\") pod \"scheduler-configmap-ref-test-kserve-5df9d587fc-xv27d\" (UID: \"f19b325b-4ac5-43a4-9c9b-1c3d55547cbc\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5df9d587fc-xv27d" Apr 17 17:36:05.191274 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:36:05.191216 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fdzfc\" (UniqueName: \"kubernetes.io/projected/f19b325b-4ac5-43a4-9c9b-1c3d55547cbc-kube-api-access-fdzfc\") pod \"scheduler-configmap-ref-test-kserve-5df9d587fc-xv27d\" (UID: \"f19b325b-4ac5-43a4-9c9b-1c3d55547cbc\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5df9d587fc-xv27d" Apr 17 17:36:05.191475 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:36:05.191454 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f19b325b-4ac5-43a4-9c9b-1c3d55547cbc-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-5df9d587fc-xv27d\" (UID: \"f19b325b-4ac5-43a4-9c9b-1c3d55547cbc\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5df9d587fc-xv27d" Apr 17 17:36:05.191547 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:36:05.191485 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f19b325b-4ac5-43a4-9c9b-1c3d55547cbc-model-cache\") pod \"scheduler-configmap-ref-test-kserve-5df9d587fc-xv27d\" (UID: \"f19b325b-4ac5-43a4-9c9b-1c3d55547cbc\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5df9d587fc-xv27d" Apr 17 17:36:05.191612 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:36:05.191594 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f19b325b-4ac5-43a4-9c9b-1c3d55547cbc-home\") pod \"scheduler-configmap-ref-test-kserve-5df9d587fc-xv27d\" (UID: \"f19b325b-4ac5-43a4-9c9b-1c3d55547cbc\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5df9d587fc-xv27d" Apr 17 17:36:05.193739 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:36:05.193718 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f19b325b-4ac5-43a4-9c9b-1c3d55547cbc-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-5df9d587fc-xv27d\" (UID: \"f19b325b-4ac5-43a4-9c9b-1c3d55547cbc\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5df9d587fc-xv27d" Apr 17 17:36:05.193826 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:36:05.193727 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f19b325b-4ac5-43a4-9c9b-1c3d55547cbc-dshm\") pod \"scheduler-configmap-ref-test-kserve-5df9d587fc-xv27d\" (UID: 
\"f19b325b-4ac5-43a4-9c9b-1c3d55547cbc\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5df9d587fc-xv27d" Apr 17 17:36:05.200173 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:36:05.200153 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdzfc\" (UniqueName: \"kubernetes.io/projected/f19b325b-4ac5-43a4-9c9b-1c3d55547cbc-kube-api-access-fdzfc\") pod \"scheduler-configmap-ref-test-kserve-5df9d587fc-xv27d\" (UID: \"f19b325b-4ac5-43a4-9c9b-1c3d55547cbc\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5df9d587fc-xv27d" Apr 17 17:36:05.338151 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:36:05.338073 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5df9d587fc-xv27d" Apr 17 17:36:05.458947 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:36:05.458916 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5df9d587fc-xv27d"] Apr 17 17:36:05.462134 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:36:05.462106 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf19b325b_4ac5_43a4_9c9b_1c3d55547cbc.slice/crio-e9d9c607a54075fcda9577dc70fd2c33e155f484034120590cba6f5fa7783a61 WatchSource:0}: Error finding container e9d9c607a54075fcda9577dc70fd2c33e155f484034120590cba6f5fa7783a61: Status 404 returned error can't find the container with id e9d9c607a54075fcda9577dc70fd2c33e155f484034120590cba6f5fa7783a61 Apr 17 17:36:05.463960 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:36:05.463943 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 17:36:06.305804 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:36:06.305744 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5df9d587fc-xv27d" 
event={"ID":"f19b325b-4ac5-43a4-9c9b-1c3d55547cbc","Type":"ContainerStarted","Data":"e9d9c607a54075fcda9577dc70fd2c33e155f484034120590cba6f5fa7783a61"} Apr 17 17:36:09.317216 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:36:09.317181 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5df9d587fc-xv27d" event={"ID":"f19b325b-4ac5-43a4-9c9b-1c3d55547cbc","Type":"ContainerStarted","Data":"d24476a5e5c2b8556e2ab2d5bb8c42bd153136f3ae223afa2646ba38dceed3e0"} Apr 17 17:36:13.333721 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:36:13.333688 2573 generic.go:358] "Generic (PLEG): container finished" podID="f19b325b-4ac5-43a4-9c9b-1c3d55547cbc" containerID="d24476a5e5c2b8556e2ab2d5bb8c42bd153136f3ae223afa2646ba38dceed3e0" exitCode=0 Apr 17 17:36:13.334126 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:36:13.333756 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5df9d587fc-xv27d" event={"ID":"f19b325b-4ac5-43a4-9c9b-1c3d55547cbc","Type":"ContainerDied","Data":"d24476a5e5c2b8556e2ab2d5bb8c42bd153136f3ae223afa2646ba38dceed3e0"} Apr 17 17:36:15.341982 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:36:15.341946 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5df9d587fc-xv27d" event={"ID":"f19b325b-4ac5-43a4-9c9b-1c3d55547cbc","Type":"ContainerStarted","Data":"172c283cac429fb75742206ee5f5d9cf21f3926945c3bd9b11c4d3c8fa3bb305"} Apr 17 17:36:15.363148 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:36:15.363104 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5df9d587fc-xv27d" podStartSLOduration=2.382613236 podStartE2EDuration="11.363091412s" podCreationTimestamp="2026-04-17 17:36:04 +0000 UTC" firstStartedPulling="2026-04-17 17:36:05.464147759 +0000 UTC m=+710.780229586" lastFinishedPulling="2026-04-17 
17:36:14.444625939 +0000 UTC m=+719.760707762" observedRunningTime="2026-04-17 17:36:15.36218803 +0000 UTC m=+720.678269884" watchObservedRunningTime="2026-04-17 17:36:15.363091412 +0000 UTC m=+720.679173253" Apr 17 17:36:25.338600 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:36:25.338567 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5df9d587fc-xv27d" Apr 17 17:36:25.339183 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:36:25.338611 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5df9d587fc-xv27d" Apr 17 17:36:25.350681 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:36:25.350656 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5df9d587fc-xv27d" Apr 17 17:36:25.382752 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:36:25.382722 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5df9d587fc-xv27d" Apr 17 17:36:57.240649 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:36:57.240613 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5df9d587fc-xv27d"] Apr 17 17:36:57.241141 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:36:57.240925 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5df9d587fc-xv27d" podUID="f19b325b-4ac5-43a4-9c9b-1c3d55547cbc" containerName="main" containerID="cri-o://172c283cac429fb75742206ee5f5d9cf21f3926945c3bd9b11c4d3c8fa3bb305" gracePeriod=30 Apr 17 17:36:57.468145 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:36:57.468102 2573 generic.go:358] "Generic (PLEG): container finished" podID="f19b325b-4ac5-43a4-9c9b-1c3d55547cbc" 
containerID="172c283cac429fb75742206ee5f5d9cf21f3926945c3bd9b11c4d3c8fa3bb305" exitCode=0 Apr 17 17:36:57.468260 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:36:57.468184 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5df9d587fc-xv27d" event={"ID":"f19b325b-4ac5-43a4-9c9b-1c3d55547cbc","Type":"ContainerDied","Data":"172c283cac429fb75742206ee5f5d9cf21f3926945c3bd9b11c4d3c8fa3bb305"} Apr 17 17:36:57.468260 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:36:57.468220 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5df9d587fc-xv27d" event={"ID":"f19b325b-4ac5-43a4-9c9b-1c3d55547cbc","Type":"ContainerDied","Data":"e9d9c607a54075fcda9577dc70fd2c33e155f484034120590cba6f5fa7783a61"} Apr 17 17:36:57.468260 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:36:57.468231 2573 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e9d9c607a54075fcda9577dc70fd2c33e155f484034120590cba6f5fa7783a61" Apr 17 17:36:57.478302 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:36:57.478283 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5df9d587fc-xv27d" Apr 17 17:36:57.485443 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:36:57.485424 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f19b325b-4ac5-43a4-9c9b-1c3d55547cbc-dshm\") pod \"f19b325b-4ac5-43a4-9c9b-1c3d55547cbc\" (UID: \"f19b325b-4ac5-43a4-9c9b-1c3d55547cbc\") " Apr 17 17:36:57.485555 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:36:57.485467 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fdzfc\" (UniqueName: \"kubernetes.io/projected/f19b325b-4ac5-43a4-9c9b-1c3d55547cbc-kube-api-access-fdzfc\") pod \"f19b325b-4ac5-43a4-9c9b-1c3d55547cbc\" (UID: \"f19b325b-4ac5-43a4-9c9b-1c3d55547cbc\") " Apr 17 17:36:57.485555 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:36:57.485495 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f19b325b-4ac5-43a4-9c9b-1c3d55547cbc-tls-certs\") pod \"f19b325b-4ac5-43a4-9c9b-1c3d55547cbc\" (UID: \"f19b325b-4ac5-43a4-9c9b-1c3d55547cbc\") " Apr 17 17:36:57.485555 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:36:57.485535 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f19b325b-4ac5-43a4-9c9b-1c3d55547cbc-home\") pod \"f19b325b-4ac5-43a4-9c9b-1c3d55547cbc\" (UID: \"f19b325b-4ac5-43a4-9c9b-1c3d55547cbc\") " Apr 17 17:36:57.485694 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:36:57.485559 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f19b325b-4ac5-43a4-9c9b-1c3d55547cbc-model-cache\") pod \"f19b325b-4ac5-43a4-9c9b-1c3d55547cbc\" (UID: \"f19b325b-4ac5-43a4-9c9b-1c3d55547cbc\") " Apr 17 17:36:57.485864 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:36:57.485838 
2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f19b325b-4ac5-43a4-9c9b-1c3d55547cbc-home" (OuterVolumeSpecName: "home") pod "f19b325b-4ac5-43a4-9c9b-1c3d55547cbc" (UID: "f19b325b-4ac5-43a4-9c9b-1c3d55547cbc"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:36:57.485960 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:36:57.485866 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f19b325b-4ac5-43a4-9c9b-1c3d55547cbc-model-cache" (OuterVolumeSpecName: "model-cache") pod "f19b325b-4ac5-43a4-9c9b-1c3d55547cbc" (UID: "f19b325b-4ac5-43a4-9c9b-1c3d55547cbc"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:36:57.487607 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:36:57.487586 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f19b325b-4ac5-43a4-9c9b-1c3d55547cbc-kube-api-access-fdzfc" (OuterVolumeSpecName: "kube-api-access-fdzfc") pod "f19b325b-4ac5-43a4-9c9b-1c3d55547cbc" (UID: "f19b325b-4ac5-43a4-9c9b-1c3d55547cbc"). InnerVolumeSpecName "kube-api-access-fdzfc". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:36:57.487681 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:36:57.487617 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f19b325b-4ac5-43a4-9c9b-1c3d55547cbc-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "f19b325b-4ac5-43a4-9c9b-1c3d55547cbc" (UID: "f19b325b-4ac5-43a4-9c9b-1c3d55547cbc"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:36:57.487727 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:36:57.487686 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f19b325b-4ac5-43a4-9c9b-1c3d55547cbc-dshm" (OuterVolumeSpecName: "dshm") pod "f19b325b-4ac5-43a4-9c9b-1c3d55547cbc" (UID: "f19b325b-4ac5-43a4-9c9b-1c3d55547cbc"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:36:57.585938 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:36:57.585911 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f19b325b-4ac5-43a4-9c9b-1c3d55547cbc-kserve-provision-location\") pod \"f19b325b-4ac5-43a4-9c9b-1c3d55547cbc\" (UID: \"f19b325b-4ac5-43a4-9c9b-1c3d55547cbc\") " Apr 17 17:36:57.586076 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:36:57.586064 2573 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f19b325b-4ac5-43a4-9c9b-1c3d55547cbc-dshm\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\"" Apr 17 17:36:57.586128 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:36:57.586080 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fdzfc\" (UniqueName: \"kubernetes.io/projected/f19b325b-4ac5-43a4-9c9b-1c3d55547cbc-kube-api-access-fdzfc\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\"" Apr 17 17:36:57.586128 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:36:57.586091 2573 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f19b325b-4ac5-43a4-9c9b-1c3d55547cbc-tls-certs\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\"" Apr 17 17:36:57.586128 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:36:57.586100 2573 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: 
\"kubernetes.io/empty-dir/f19b325b-4ac5-43a4-9c9b-1c3d55547cbc-home\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\"" Apr 17 17:36:57.586128 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:36:57.586108 2573 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f19b325b-4ac5-43a4-9c9b-1c3d55547cbc-model-cache\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\"" Apr 17 17:36:57.639150 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:36:57.639121 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f19b325b-4ac5-43a4-9c9b-1c3d55547cbc-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f19b325b-4ac5-43a4-9c9b-1c3d55547cbc" (UID: "f19b325b-4ac5-43a4-9c9b-1c3d55547cbc"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:36:57.686380 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:36:57.686359 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f19b325b-4ac5-43a4-9c9b-1c3d55547cbc-kserve-provision-location\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\"" Apr 17 17:36:58.471144 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:36:58.471113 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5df9d587fc-xv27d" Apr 17 17:36:58.493310 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:36:58.493253 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5df9d587fc-xv27d"] Apr 17 17:36:58.497283 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:36:58.497234 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5df9d587fc-xv27d"] Apr 17 17:36:59.196292 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:36:59.196260 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f19b325b-4ac5-43a4-9c9b-1c3d55547cbc" path="/var/lib/kubelet/pods/f19b325b-4ac5-43a4-9c9b-1c3d55547cbc/volumes" Apr 17 17:37:06.810890 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:37:06.810858 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-779b5987b68r64h"] Apr 17 17:37:06.811351 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:37:06.811313 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f19b325b-4ac5-43a4-9c9b-1c3d55547cbc" containerName="storage-initializer" Apr 17 17:37:06.811351 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:37:06.811333 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="f19b325b-4ac5-43a4-9c9b-1c3d55547cbc" containerName="storage-initializer" Apr 17 17:37:06.811351 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:37:06.811346 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f19b325b-4ac5-43a4-9c9b-1c3d55547cbc" containerName="main" Apr 17 17:37:06.811508 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:37:06.811354 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="f19b325b-4ac5-43a4-9c9b-1c3d55547cbc" containerName="main" Apr 17 17:37:06.811508 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:37:06.811454 2573 
memory_manager.go:356] "RemoveStaleState removing state" podUID="f19b325b-4ac5-43a4-9c9b-1c3d55547cbc" containerName="main" Apr 17 17:37:06.814306 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:37:06.814285 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-779b5987b68r64h" Apr 17 17:37:06.816839 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:37:06.816813 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvdde380eaa9fe1facad32d45131f9e34d-kserve-self-signed-certs\"" Apr 17 17:37:06.817298 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:37:06.817276 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 17 17:37:06.817418 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:37:06.817300 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-rf529\"" Apr 17 17:37:06.817418 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:37:06.817301 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 17 17:37:06.825267 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:37:06.825247 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-779b5987b68r64h"] Apr 17 17:37:06.851456 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:37:06.851429 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/aa50f95e-ff8f-469c-bcbf-5b907b155a1e-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-779b5987b68r64h\" (UID: \"aa50f95e-ff8f-469c-bcbf-5b907b155a1e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-779b5987b68r64h" Apr 17 17:37:06.851560 
ip-10-0-139-84 kubenswrapper[2573]: I0417 17:37:06.851460 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/aa50f95e-ff8f-469c-bcbf-5b907b155a1e-dshm\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-779b5987b68r64h\" (UID: \"aa50f95e-ff8f-469c-bcbf-5b907b155a1e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-779b5987b68r64h" Apr 17 17:37:06.851560 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:37:06.851480 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/aa50f95e-ff8f-469c-bcbf-5b907b155a1e-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-779b5987b68r64h\" (UID: \"aa50f95e-ff8f-469c-bcbf-5b907b155a1e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-779b5987b68r64h" Apr 17 17:37:06.851560 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:37:06.851500 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/aa50f95e-ff8f-469c-bcbf-5b907b155a1e-home\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-779b5987b68r64h\" (UID: \"aa50f95e-ff8f-469c-bcbf-5b907b155a1e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-779b5987b68r64h" Apr 17 17:37:06.851662 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:37:06.851582 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/aa50f95e-ff8f-469c-bcbf-5b907b155a1e-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-779b5987b68r64h\" (UID: \"aa50f95e-ff8f-469c-bcbf-5b907b155a1e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-779b5987b68r64h" Apr 17 17:37:06.851662 
ip-10-0-139-84 kubenswrapper[2573]: I0417 17:37:06.851610 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4srr\" (UniqueName: \"kubernetes.io/projected/aa50f95e-ff8f-469c-bcbf-5b907b155a1e-kube-api-access-q4srr\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-779b5987b68r64h\" (UID: \"aa50f95e-ff8f-469c-bcbf-5b907b155a1e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-779b5987b68r64h" Apr 17 17:37:06.951912 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:37:06.951889 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/aa50f95e-ff8f-469c-bcbf-5b907b155a1e-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-779b5987b68r64h\" (UID: \"aa50f95e-ff8f-469c-bcbf-5b907b155a1e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-779b5987b68r64h" Apr 17 17:37:06.952084 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:37:06.951920 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/aa50f95e-ff8f-469c-bcbf-5b907b155a1e-dshm\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-779b5987b68r64h\" (UID: \"aa50f95e-ff8f-469c-bcbf-5b907b155a1e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-779b5987b68r64h" Apr 17 17:37:06.952084 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:37:06.951940 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/aa50f95e-ff8f-469c-bcbf-5b907b155a1e-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-779b5987b68r64h\" (UID: \"aa50f95e-ff8f-469c-bcbf-5b907b155a1e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-779b5987b68r64h" Apr 17 17:37:06.952084 ip-10-0-139-84 
kubenswrapper[2573]: I0417 17:37:06.951972 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/aa50f95e-ff8f-469c-bcbf-5b907b155a1e-home\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-779b5987b68r64h\" (UID: \"aa50f95e-ff8f-469c-bcbf-5b907b155a1e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-779b5987b68r64h" Apr 17 17:37:06.952084 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:37:06.951999 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/aa50f95e-ff8f-469c-bcbf-5b907b155a1e-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-779b5987b68r64h\" (UID: \"aa50f95e-ff8f-469c-bcbf-5b907b155a1e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-779b5987b68r64h" Apr 17 17:37:06.952084 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:37:06.952033 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q4srr\" (UniqueName: \"kubernetes.io/projected/aa50f95e-ff8f-469c-bcbf-5b907b155a1e-kube-api-access-q4srr\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-779b5987b68r64h\" (UID: \"aa50f95e-ff8f-469c-bcbf-5b907b155a1e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-779b5987b68r64h" Apr 17 17:37:06.952444 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:37:06.952423 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/aa50f95e-ff8f-469c-bcbf-5b907b155a1e-home\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-779b5987b68r64h\" (UID: \"aa50f95e-ff8f-469c-bcbf-5b907b155a1e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-779b5987b68r64h" Apr 17 17:37:06.952514 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:37:06.952466 2573 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/aa50f95e-ff8f-469c-bcbf-5b907b155a1e-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-779b5987b68r64h\" (UID: \"aa50f95e-ff8f-469c-bcbf-5b907b155a1e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-779b5987b68r64h" Apr 17 17:37:06.952613 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:37:06.952593 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/aa50f95e-ff8f-469c-bcbf-5b907b155a1e-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-779b5987b68r64h\" (UID: \"aa50f95e-ff8f-469c-bcbf-5b907b155a1e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-779b5987b68r64h" Apr 17 17:37:06.954131 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:37:06.954111 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/aa50f95e-ff8f-469c-bcbf-5b907b155a1e-dshm\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-779b5987b68r64h\" (UID: \"aa50f95e-ff8f-469c-bcbf-5b907b155a1e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-779b5987b68r64h" Apr 17 17:37:06.954200 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:37:06.954185 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/aa50f95e-ff8f-469c-bcbf-5b907b155a1e-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-779b5987b68r64h\" (UID: \"aa50f95e-ff8f-469c-bcbf-5b907b155a1e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-779b5987b68r64h" Apr 17 17:37:06.959717 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:37:06.959697 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4srr\" (UniqueName: 
\"kubernetes.io/projected/aa50f95e-ff8f-469c-bcbf-5b907b155a1e-kube-api-access-q4srr\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-779b5987b68r64h\" (UID: \"aa50f95e-ff8f-469c-bcbf-5b907b155a1e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-779b5987b68r64h"
Apr 17 17:37:07.125915 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:37:07.125892 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-779b5987b68r64h"
Apr 17 17:37:07.252119 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:37:07.252089 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-779b5987b68r64h"]
Apr 17 17:37:07.255480 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:37:07.255445 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaa50f95e_ff8f_469c_bcbf_5b907b155a1e.slice/crio-d88715e0fc1e9ef9b2ea07b4fcb49016653e4b52d0d62d8bb28123cd1498f6e9 WatchSource:0}: Error finding container d88715e0fc1e9ef9b2ea07b4fcb49016653e4b52d0d62d8bb28123cd1498f6e9: Status 404 returned error can't find the container with id d88715e0fc1e9ef9b2ea07b4fcb49016653e4b52d0d62d8bb28123cd1498f6e9
Apr 17 17:37:07.503873 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:37:07.503793 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-779b5987b68r64h" event={"ID":"aa50f95e-ff8f-469c-bcbf-5b907b155a1e","Type":"ContainerStarted","Data":"3657018e690dd771bcb64917588c50a20ac4980c4e18d3c1d2ba508def02b14c"}
Apr 17 17:37:07.503873 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:37:07.503828 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-779b5987b68r64h" event={"ID":"aa50f95e-ff8f-469c-bcbf-5b907b155a1e","Type":"ContainerStarted","Data":"d88715e0fc1e9ef9b2ea07b4fcb49016653e4b52d0d62d8bb28123cd1498f6e9"}
Apr 17 17:37:11.522532 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:37:11.522440 2573 generic.go:358] "Generic (PLEG): container finished" podID="aa50f95e-ff8f-469c-bcbf-5b907b155a1e" containerID="3657018e690dd771bcb64917588c50a20ac4980c4e18d3c1d2ba508def02b14c" exitCode=0
Apr 17 17:37:11.522532 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:37:11.522518 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-779b5987b68r64h" event={"ID":"aa50f95e-ff8f-469c-bcbf-5b907b155a1e","Type":"ContainerDied","Data":"3657018e690dd771bcb64917588c50a20ac4980c4e18d3c1d2ba508def02b14c"}
Apr 17 17:37:25.601067 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:37:25.601030 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-8698d8565c-d8fhr"]
Apr 17 17:37:25.630858 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:37:25.630823 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-8698d8565c-d8fhr"]
Apr 17 17:37:25.631045 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:37:25.630895 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-8698d8565c-d8fhr"
Apr 17 17:37:25.633586 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:37:25.633560 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-kserve-self-signed-certs\""
Apr 17 17:37:25.722465 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:37:25.722429 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dk54j\" (UniqueName: \"kubernetes.io/projected/dba91a22-b726-485f-aa56-a2ced795a51a-kube-api-access-dk54j\") pod \"scheduler-ha-replicas-test-kserve-8698d8565c-d8fhr\" (UID: \"dba91a22-b726-485f-aa56-a2ced795a51a\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-8698d8565c-d8fhr"
Apr 17 17:37:25.722644 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:37:25.722483 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/dba91a22-b726-485f-aa56-a2ced795a51a-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-8698d8565c-d8fhr\" (UID: \"dba91a22-b726-485f-aa56-a2ced795a51a\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-8698d8565c-d8fhr"
Apr 17 17:37:25.722644 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:37:25.722542 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/dba91a22-b726-485f-aa56-a2ced795a51a-home\") pod \"scheduler-ha-replicas-test-kserve-8698d8565c-d8fhr\" (UID: \"dba91a22-b726-485f-aa56-a2ced795a51a\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-8698d8565c-d8fhr"
Apr 17 17:37:25.722774 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:37:25.722646 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/dba91a22-b726-485f-aa56-a2ced795a51a-dshm\") pod \"scheduler-ha-replicas-test-kserve-8698d8565c-d8fhr\" (UID: \"dba91a22-b726-485f-aa56-a2ced795a51a\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-8698d8565c-d8fhr"
Apr 17 17:37:25.722774 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:37:25.722693 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/dba91a22-b726-485f-aa56-a2ced795a51a-model-cache\") pod \"scheduler-ha-replicas-test-kserve-8698d8565c-d8fhr\" (UID: \"dba91a22-b726-485f-aa56-a2ced795a51a\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-8698d8565c-d8fhr"
Apr 17 17:37:25.722774 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:37:25.722745 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dba91a22-b726-485f-aa56-a2ced795a51a-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-8698d8565c-d8fhr\" (UID: \"dba91a22-b726-485f-aa56-a2ced795a51a\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-8698d8565c-d8fhr"
Apr 17 17:37:25.823899 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:37:25.823859 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/dba91a22-b726-485f-aa56-a2ced795a51a-dshm\") pod \"scheduler-ha-replicas-test-kserve-8698d8565c-d8fhr\" (UID: \"dba91a22-b726-485f-aa56-a2ced795a51a\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-8698d8565c-d8fhr"
Apr 17 17:37:25.824087 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:37:25.823915 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/dba91a22-b726-485f-aa56-a2ced795a51a-model-cache\") pod \"scheduler-ha-replicas-test-kserve-8698d8565c-d8fhr\" (UID: \"dba91a22-b726-485f-aa56-a2ced795a51a\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-8698d8565c-d8fhr"
Apr 17 17:37:25.824087 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:37:25.823959 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dba91a22-b726-485f-aa56-a2ced795a51a-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-8698d8565c-d8fhr\" (UID: \"dba91a22-b726-485f-aa56-a2ced795a51a\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-8698d8565c-d8fhr"
Apr 17 17:37:25.824087 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:37:25.824009 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dk54j\" (UniqueName: \"kubernetes.io/projected/dba91a22-b726-485f-aa56-a2ced795a51a-kube-api-access-dk54j\") pod \"scheduler-ha-replicas-test-kserve-8698d8565c-d8fhr\" (UID: \"dba91a22-b726-485f-aa56-a2ced795a51a\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-8698d8565c-d8fhr"
Apr 17 17:37:25.824087 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:37:25.824062 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/dba91a22-b726-485f-aa56-a2ced795a51a-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-8698d8565c-d8fhr\" (UID: \"dba91a22-b726-485f-aa56-a2ced795a51a\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-8698d8565c-d8fhr"
Apr 17 17:37:25.824289 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:37:25.824102 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/dba91a22-b726-485f-aa56-a2ced795a51a-home\") pod \"scheduler-ha-replicas-test-kserve-8698d8565c-d8fhr\" (UID: \"dba91a22-b726-485f-aa56-a2ced795a51a\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-8698d8565c-d8fhr"
Apr 17 17:37:25.824423 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:37:25.824387 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/dba91a22-b726-485f-aa56-a2ced795a51a-model-cache\") pod \"scheduler-ha-replicas-test-kserve-8698d8565c-d8fhr\" (UID: \"dba91a22-b726-485f-aa56-a2ced795a51a\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-8698d8565c-d8fhr"
Apr 17 17:37:25.824513 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:37:25.824450 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dba91a22-b726-485f-aa56-a2ced795a51a-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-8698d8565c-d8fhr\" (UID: \"dba91a22-b726-485f-aa56-a2ced795a51a\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-8698d8565c-d8fhr"
Apr 17 17:37:25.824513 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:37:25.824482 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/dba91a22-b726-485f-aa56-a2ced795a51a-home\") pod \"scheduler-ha-replicas-test-kserve-8698d8565c-d8fhr\" (UID: \"dba91a22-b726-485f-aa56-a2ced795a51a\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-8698d8565c-d8fhr"
Apr 17 17:37:25.826721 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:37:25.826693 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/dba91a22-b726-485f-aa56-a2ced795a51a-dshm\") pod \"scheduler-ha-replicas-test-kserve-8698d8565c-d8fhr\" (UID: \"dba91a22-b726-485f-aa56-a2ced795a51a\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-8698d8565c-d8fhr"
Apr 17 17:37:25.826951 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:37:25.826931 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/dba91a22-b726-485f-aa56-a2ced795a51a-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-8698d8565c-d8fhr\" (UID: \"dba91a22-b726-485f-aa56-a2ced795a51a\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-8698d8565c-d8fhr"
Apr 17 17:37:25.833474 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:37:25.833438 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dk54j\" (UniqueName: \"kubernetes.io/projected/dba91a22-b726-485f-aa56-a2ced795a51a-kube-api-access-dk54j\") pod \"scheduler-ha-replicas-test-kserve-8698d8565c-d8fhr\" (UID: \"dba91a22-b726-485f-aa56-a2ced795a51a\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-8698d8565c-d8fhr"
Apr 17 17:37:25.942602 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:37:25.942560 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-8698d8565c-d8fhr"
Apr 17 17:37:38.441701 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:37:38.441625 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-8698d8565c-d8fhr"]
Apr 17 17:37:38.444862 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:37:38.444834 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddba91a22_b726_485f_aa56_a2ced795a51a.slice/crio-f5745e5b72ce5ef9d205bb2843820560f7ee8391285e7441cf7bf9f994671db5 WatchSource:0}: Error finding container f5745e5b72ce5ef9d205bb2843820560f7ee8391285e7441cf7bf9f994671db5: Status 404 returned error can't find the container with id f5745e5b72ce5ef9d205bb2843820560f7ee8391285e7441cf7bf9f994671db5
Apr 17 17:37:38.619402 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:37:38.619329 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-8698d8565c-d8fhr" event={"ID":"dba91a22-b726-485f-aa56-a2ced795a51a","Type":"ContainerStarted","Data":"54845592fee160dcb7c43335b939186d17930374704e5ecc5d7915e84e2fefec"}
Apr 17 17:37:38.619402 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:37:38.619402 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-8698d8565c-d8fhr" event={"ID":"dba91a22-b726-485f-aa56-a2ced795a51a","Type":"ContainerStarted","Data":"f5745e5b72ce5ef9d205bb2843820560f7ee8391285e7441cf7bf9f994671db5"}
Apr 17 17:37:38.621120 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:37:38.621093 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-779b5987b68r64h" event={"ID":"aa50f95e-ff8f-469c-bcbf-5b907b155a1e","Type":"ContainerStarted","Data":"f04979a773ef75844b51e649e299905d9e1e9ee7ad4032e371803287aa3e340f"}
Apr 17 17:37:38.660480 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:37:38.660418 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-779b5987b68r64h" podStartSLOduration=5.716817906 podStartE2EDuration="32.660399778s" podCreationTimestamp="2026-04-17 17:37:06 +0000 UTC" firstStartedPulling="2026-04-17 17:37:11.523595321 +0000 UTC m=+776.839677144" lastFinishedPulling="2026-04-17 17:37:38.467177196 +0000 UTC m=+803.783259016" observedRunningTime="2026-04-17 17:37:38.65805405 +0000 UTC m=+803.974135894" watchObservedRunningTime="2026-04-17 17:37:38.660399778 +0000 UTC m=+803.976481622"
Apr 17 17:37:42.642068 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:37:42.642009 2573 generic.go:358] "Generic (PLEG): container finished" podID="dba91a22-b726-485f-aa56-a2ced795a51a" containerID="54845592fee160dcb7c43335b939186d17930374704e5ecc5d7915e84e2fefec" exitCode=0
Apr 17 17:37:42.642504 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:37:42.642089 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-8698d8565c-d8fhr" event={"ID":"dba91a22-b726-485f-aa56-a2ced795a51a","Type":"ContainerDied","Data":"54845592fee160dcb7c43335b939186d17930374704e5ecc5d7915e84e2fefec"}
Apr 17 17:37:43.646701 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:37:43.646662 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-8698d8565c-d8fhr" event={"ID":"dba91a22-b726-485f-aa56-a2ced795a51a","Type":"ContainerStarted","Data":"1bb2f5bb70c05fb51f4ff1e1f60c69c1dfe858f2c561041acec621189db6d1f6"}
Apr 17 17:37:43.665895 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:37:43.665848 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-8698d8565c-d8fhr" podStartSLOduration=18.665818954 podStartE2EDuration="18.665818954s" podCreationTimestamp="2026-04-17 17:37:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:37:43.665273505 +0000 UTC m=+808.981355350" watchObservedRunningTime="2026-04-17 17:37:43.665818954 +0000 UTC m=+808.981900795"
Apr 17 17:37:45.943223 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:37:45.943175 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-8698d8565c-d8fhr"
Apr 17 17:37:45.943670 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:37:45.943249 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-8698d8565c-d8fhr"
Apr 17 17:37:45.956573 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:37:45.956532 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-8698d8565c-d8fhr"
Apr 17 17:37:46.670607 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:37:46.670577 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-8698d8565c-d8fhr"
Apr 17 17:37:47.126330 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:37:47.126291 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-779b5987b68r64h"
Apr 17 17:37:47.126330 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:37:47.126324 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-779b5987b68r64h"
Apr 17 17:37:47.127827 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:37:47.127771 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-779b5987b68r64h" podUID="aa50f95e-ff8f-469c-bcbf-5b907b155a1e" containerName="main" probeResult="failure" output="Get \"https://10.133.0.21:8000/health\": dial tcp 10.133.0.21:8000: connect: connection refused"
Apr 17 17:37:57.126789 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:37:57.126741 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-779b5987b68r64h" podUID="aa50f95e-ff8f-469c-bcbf-5b907b155a1e" containerName="main" probeResult="failure" output="Get \"https://10.133.0.21:8000/health\": dial tcp 10.133.0.21:8000: connect: connection refused"
Apr 17 17:37:58.684241 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:37:58.684207 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-8698d8565c-d8fhr"]
Apr 17 17:37:58.684633 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:37:58.684586 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-8698d8565c-d8fhr" podUID="dba91a22-b726-485f-aa56-a2ced795a51a" containerName="main" containerID="cri-o://1bb2f5bb70c05fb51f4ff1e1f60c69c1dfe858f2c561041acec621189db6d1f6" gracePeriod=30
Apr 17 17:37:59.454594 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:37:59.454572 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-8698d8565c-d8fhr"
Apr 17 17:37:59.505942 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:37:59.505914 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/dba91a22-b726-485f-aa56-a2ced795a51a-model-cache\") pod \"dba91a22-b726-485f-aa56-a2ced795a51a\" (UID: \"dba91a22-b726-485f-aa56-a2ced795a51a\") "
Apr 17 17:37:59.506130 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:37:59.505952 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/dba91a22-b726-485f-aa56-a2ced795a51a-home\") pod \"dba91a22-b726-485f-aa56-a2ced795a51a\" (UID: \"dba91a22-b726-485f-aa56-a2ced795a51a\") "
Apr 17 17:37:59.506130 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:37:59.505987 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dba91a22-b726-485f-aa56-a2ced795a51a-kserve-provision-location\") pod \"dba91a22-b726-485f-aa56-a2ced795a51a\" (UID: \"dba91a22-b726-485f-aa56-a2ced795a51a\") "
Apr 17 17:37:59.506130 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:37:59.506038 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/dba91a22-b726-485f-aa56-a2ced795a51a-dshm\") pod \"dba91a22-b726-485f-aa56-a2ced795a51a\" (UID: \"dba91a22-b726-485f-aa56-a2ced795a51a\") "
Apr 17 17:37:59.506300 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:37:59.506130 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/dba91a22-b726-485f-aa56-a2ced795a51a-tls-certs\") pod \"dba91a22-b726-485f-aa56-a2ced795a51a\" (UID: \"dba91a22-b726-485f-aa56-a2ced795a51a\") "
Apr 17 17:37:59.506300 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:37:59.506156 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dk54j\" (UniqueName: \"kubernetes.io/projected/dba91a22-b726-485f-aa56-a2ced795a51a-kube-api-access-dk54j\") pod \"dba91a22-b726-485f-aa56-a2ced795a51a\" (UID: \"dba91a22-b726-485f-aa56-a2ced795a51a\") "
Apr 17 17:37:59.506300 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:37:59.506222 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dba91a22-b726-485f-aa56-a2ced795a51a-model-cache" (OuterVolumeSpecName: "model-cache") pod "dba91a22-b726-485f-aa56-a2ced795a51a" (UID: "dba91a22-b726-485f-aa56-a2ced795a51a"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 17:37:59.506300 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:37:59.506238 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dba91a22-b726-485f-aa56-a2ced795a51a-home" (OuterVolumeSpecName: "home") pod "dba91a22-b726-485f-aa56-a2ced795a51a" (UID: "dba91a22-b726-485f-aa56-a2ced795a51a"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 17:37:59.506495 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:37:59.506365 2573 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/dba91a22-b726-485f-aa56-a2ced795a51a-model-cache\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\""
Apr 17 17:37:59.506495 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:37:59.506384 2573 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/dba91a22-b726-485f-aa56-a2ced795a51a-home\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\""
Apr 17 17:37:59.508308 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:37:59.508283 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dba91a22-b726-485f-aa56-a2ced795a51a-kube-api-access-dk54j" (OuterVolumeSpecName: "kube-api-access-dk54j") pod "dba91a22-b726-485f-aa56-a2ced795a51a" (UID: "dba91a22-b726-485f-aa56-a2ced795a51a"). InnerVolumeSpecName "kube-api-access-dk54j". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 17:37:59.508308 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:37:59.508289 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dba91a22-b726-485f-aa56-a2ced795a51a-dshm" (OuterVolumeSpecName: "dshm") pod "dba91a22-b726-485f-aa56-a2ced795a51a" (UID: "dba91a22-b726-485f-aa56-a2ced795a51a"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 17:37:59.508442 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:37:59.508303 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dba91a22-b726-485f-aa56-a2ced795a51a-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "dba91a22-b726-485f-aa56-a2ced795a51a" (UID: "dba91a22-b726-485f-aa56-a2ced795a51a"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 17:37:59.571673 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:37:59.571626 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dba91a22-b726-485f-aa56-a2ced795a51a-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "dba91a22-b726-485f-aa56-a2ced795a51a" (UID: "dba91a22-b726-485f-aa56-a2ced795a51a"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 17:37:59.607076 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:37:59.607041 2573 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/dba91a22-b726-485f-aa56-a2ced795a51a-tls-certs\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\""
Apr 17 17:37:59.607076 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:37:59.607069 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dk54j\" (UniqueName: \"kubernetes.io/projected/dba91a22-b726-485f-aa56-a2ced795a51a-kube-api-access-dk54j\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\""
Apr 17 17:37:59.607226 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:37:59.607084 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dba91a22-b726-485f-aa56-a2ced795a51a-kserve-provision-location\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\""
Apr 17 17:37:59.607226 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:37:59.607097 2573 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/dba91a22-b726-485f-aa56-a2ced795a51a-dshm\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\""
Apr 17 17:37:59.698118 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:37:59.698043 2573 generic.go:358] "Generic (PLEG): container finished" podID="dba91a22-b726-485f-aa56-a2ced795a51a" containerID="1bb2f5bb70c05fb51f4ff1e1f60c69c1dfe858f2c561041acec621189db6d1f6" exitCode=0
Apr 17 17:37:59.698118 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:37:59.698090 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-8698d8565c-d8fhr" event={"ID":"dba91a22-b726-485f-aa56-a2ced795a51a","Type":"ContainerDied","Data":"1bb2f5bb70c05fb51f4ff1e1f60c69c1dfe858f2c561041acec621189db6d1f6"}
Apr 17 17:37:59.698118 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:37:59.698096 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-8698d8565c-d8fhr"
Apr 17 17:37:59.698587 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:37:59.698124 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-8698d8565c-d8fhr" event={"ID":"dba91a22-b726-485f-aa56-a2ced795a51a","Type":"ContainerDied","Data":"f5745e5b72ce5ef9d205bb2843820560f7ee8391285e7441cf7bf9f994671db5"}
Apr 17 17:37:59.698587 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:37:59.698144 2573 scope.go:117] "RemoveContainer" containerID="1bb2f5bb70c05fb51f4ff1e1f60c69c1dfe858f2c561041acec621189db6d1f6"
Apr 17 17:37:59.716207 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:37:59.716155 2573 scope.go:117] "RemoveContainer" containerID="54845592fee160dcb7c43335b939186d17930374704e5ecc5d7915e84e2fefec"
Apr 17 17:37:59.727295 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:37:59.727269 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-8698d8565c-d8fhr"]
Apr 17 17:37:59.727953 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:37:59.727929 2573 scope.go:117] "RemoveContainer" containerID="1bb2f5bb70c05fb51f4ff1e1f60c69c1dfe858f2c561041acec621189db6d1f6"
Apr 17 17:37:59.728222 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:37:59.728204 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1bb2f5bb70c05fb51f4ff1e1f60c69c1dfe858f2c561041acec621189db6d1f6\": container with ID starting with 1bb2f5bb70c05fb51f4ff1e1f60c69c1dfe858f2c561041acec621189db6d1f6 not found: ID does not exist" containerID="1bb2f5bb70c05fb51f4ff1e1f60c69c1dfe858f2c561041acec621189db6d1f6"
Apr 17 17:37:59.728279 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:37:59.728232 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1bb2f5bb70c05fb51f4ff1e1f60c69c1dfe858f2c561041acec621189db6d1f6"} err="failed to get container status \"1bb2f5bb70c05fb51f4ff1e1f60c69c1dfe858f2c561041acec621189db6d1f6\": rpc error: code = NotFound desc = could not find container \"1bb2f5bb70c05fb51f4ff1e1f60c69c1dfe858f2c561041acec621189db6d1f6\": container with ID starting with 1bb2f5bb70c05fb51f4ff1e1f60c69c1dfe858f2c561041acec621189db6d1f6 not found: ID does not exist"
Apr 17 17:37:59.728279 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:37:59.728250 2573 scope.go:117] "RemoveContainer" containerID="54845592fee160dcb7c43335b939186d17930374704e5ecc5d7915e84e2fefec"
Apr 17 17:37:59.728473 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:37:59.728450 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54845592fee160dcb7c43335b939186d17930374704e5ecc5d7915e84e2fefec\": container with ID starting with 54845592fee160dcb7c43335b939186d17930374704e5ecc5d7915e84e2fefec not found: ID does not exist" containerID="54845592fee160dcb7c43335b939186d17930374704e5ecc5d7915e84e2fefec"
Apr 17 17:37:59.728512 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:37:59.728486 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54845592fee160dcb7c43335b939186d17930374704e5ecc5d7915e84e2fefec"} err="failed to get container status \"54845592fee160dcb7c43335b939186d17930374704e5ecc5d7915e84e2fefec\": rpc error: code = NotFound desc = could not find container \"54845592fee160dcb7c43335b939186d17930374704e5ecc5d7915e84e2fefec\": container with ID starting with 54845592fee160dcb7c43335b939186d17930374704e5ecc5d7915e84e2fefec not found: ID does not exist"
Apr 17 17:37:59.732240 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:37:59.732210 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-8698d8565c-d8fhr"]
Apr 17 17:38:01.196805 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:38:01.196774 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dba91a22-b726-485f-aa56-a2ced795a51a" path="/var/lib/kubelet/pods/dba91a22-b726-485f-aa56-a2ced795a51a/volumes"
Apr 17 17:38:06.910389 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:38:06.910354 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-849dfd55d5-vrxdt"]
Apr 17 17:38:06.910898 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:38:06.910644 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dba91a22-b726-485f-aa56-a2ced795a51a" containerName="main"
Apr 17 17:38:06.910898 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:38:06.910654 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="dba91a22-b726-485f-aa56-a2ced795a51a" containerName="main"
Apr 17 17:38:06.910898 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:38:06.910672 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dba91a22-b726-485f-aa56-a2ced795a51a" containerName="storage-initializer"
Apr 17 17:38:06.910898 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:38:06.910678 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="dba91a22-b726-485f-aa56-a2ced795a51a" containerName="storage-initializer"
Apr 17 17:38:06.910898 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:38:06.910738 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="dba91a22-b726-485f-aa56-a2ced795a51a" containerName="main"
Apr 17 17:38:07.127067 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:38:07.127006 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-779b5987b68r64h" podUID="aa50f95e-ff8f-469c-bcbf-5b907b155a1e" containerName="main" probeResult="failure" output="Get \"https://10.133.0.21:8000/health\": dial tcp 10.133.0.21:8000: connect: connection refused"
Apr 17 17:38:07.197695 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:38:07.197614 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-849dfd55d5-vrxdt"
Apr 17 17:38:07.200338 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:38:07.200316 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"precise-prefix-cache-test-kserve-self-signed-certs\""
Apr 17 17:38:07.202345 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:38:07.202322 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-849dfd55d5-vrxdt"]
Apr 17 17:38:07.267740 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:38:07.267708 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhnmb\" (UniqueName: \"kubernetes.io/projected/90d19243-b70a-4beb-b67f-ebc0e8794eb9-kube-api-access-nhnmb\") pod \"precise-prefix-cache-test-kserve-849dfd55d5-vrxdt\" (UID: \"90d19243-b70a-4beb-b67f-ebc0e8794eb9\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-849dfd55d5-vrxdt"
Apr 17 17:38:07.267861 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:38:07.267751 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/90d19243-b70a-4beb-b67f-ebc0e8794eb9-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-849dfd55d5-vrxdt\" (UID: \"90d19243-b70a-4beb-b67f-ebc0e8794eb9\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-849dfd55d5-vrxdt"
Apr 17 17:38:07.267861 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:38:07.267782 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/90d19243-b70a-4beb-b67f-ebc0e8794eb9-dshm\") pod \"precise-prefix-cache-test-kserve-849dfd55d5-vrxdt\" (UID: \"90d19243-b70a-4beb-b67f-ebc0e8794eb9\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-849dfd55d5-vrxdt"
Apr 17 17:38:07.267861 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:38:07.267851 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/90d19243-b70a-4beb-b67f-ebc0e8794eb9-model-cache\") pod \"precise-prefix-cache-test-kserve-849dfd55d5-vrxdt\" (UID: \"90d19243-b70a-4beb-b67f-ebc0e8794eb9\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-849dfd55d5-vrxdt"
Apr 17 17:38:07.267978 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:38:07.267889 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/90d19243-b70a-4beb-b67f-ebc0e8794eb9-tls-certs\") pod \"precise-prefix-cache-test-kserve-849dfd55d5-vrxdt\" (UID: \"90d19243-b70a-4beb-b67f-ebc0e8794eb9\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-849dfd55d5-vrxdt"
Apr 17 17:38:07.267978 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:38:07.267951 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/90d19243-b70a-4beb-b67f-ebc0e8794eb9-home\") pod \"precise-prefix-cache-test-kserve-849dfd55d5-vrxdt\" (UID: \"90d19243-b70a-4beb-b67f-ebc0e8794eb9\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-849dfd55d5-vrxdt"
Apr 17 17:38:07.368399 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:38:07.368365 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nhnmb\" (UniqueName: \"kubernetes.io/projected/90d19243-b70a-4beb-b67f-ebc0e8794eb9-kube-api-access-nhnmb\") pod \"precise-prefix-cache-test-kserve-849dfd55d5-vrxdt\" (UID: \"90d19243-b70a-4beb-b67f-ebc0e8794eb9\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-849dfd55d5-vrxdt"
Apr 17 17:38:07.368399 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:38:07.368403 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/90d19243-b70a-4beb-b67f-ebc0e8794eb9-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-849dfd55d5-vrxdt\" (UID: \"90d19243-b70a-4beb-b67f-ebc0e8794eb9\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-849dfd55d5-vrxdt"
Apr 17 17:38:07.368626 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:38:07.368424 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/90d19243-b70a-4beb-b67f-ebc0e8794eb9-dshm\") pod \"precise-prefix-cache-test-kserve-849dfd55d5-vrxdt\" (UID: \"90d19243-b70a-4beb-b67f-ebc0e8794eb9\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-849dfd55d5-vrxdt"
Apr 17 17:38:07.368626 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:38:07.368473 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/90d19243-b70a-4beb-b67f-ebc0e8794eb9-model-cache\") pod \"precise-prefix-cache-test-kserve-849dfd55d5-vrxdt\" (UID: \"90d19243-b70a-4beb-b67f-ebc0e8794eb9\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-849dfd55d5-vrxdt"
Apr 17 17:38:07.368626 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:38:07.368499 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/90d19243-b70a-4beb-b67f-ebc0e8794eb9-tls-certs\") pod \"precise-prefix-cache-test-kserve-849dfd55d5-vrxdt\" (UID: \"90d19243-b70a-4beb-b67f-ebc0e8794eb9\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-849dfd55d5-vrxdt"
Apr 17 17:38:07.368626 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:38:07.368525 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/90d19243-b70a-4beb-b67f-ebc0e8794eb9-home\") pod \"precise-prefix-cache-test-kserve-849dfd55d5-vrxdt\" (UID: \"90d19243-b70a-4beb-b67f-ebc0e8794eb9\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-849dfd55d5-vrxdt"
Apr 17 17:38:07.368901 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:38:07.368878 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/90d19243-b70a-4beb-b67f-ebc0e8794eb9-model-cache\") pod \"precise-prefix-cache-test-kserve-849dfd55d5-vrxdt\" (UID: \"90d19243-b70a-4beb-b67f-ebc0e8794eb9\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-849dfd55d5-vrxdt"
Apr 17 17:38:07.368901 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:38:07.368896 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/90d19243-b70a-4beb-b67f-ebc0e8794eb9-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-849dfd55d5-vrxdt\" (UID: \"90d19243-b70a-4beb-b67f-ebc0e8794eb9\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-849dfd55d5-vrxdt"
Apr 17 17:38:07.368969 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:38:07.368949 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/90d19243-b70a-4beb-b67f-ebc0e8794eb9-home\") pod
\"precise-prefix-cache-test-kserve-849dfd55d5-vrxdt\" (UID: \"90d19243-b70a-4beb-b67f-ebc0e8794eb9\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-849dfd55d5-vrxdt" Apr 17 17:38:07.370761 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:38:07.370729 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/90d19243-b70a-4beb-b67f-ebc0e8794eb9-dshm\") pod \"precise-prefix-cache-test-kserve-849dfd55d5-vrxdt\" (UID: \"90d19243-b70a-4beb-b67f-ebc0e8794eb9\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-849dfd55d5-vrxdt" Apr 17 17:38:07.371129 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:38:07.371111 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/90d19243-b70a-4beb-b67f-ebc0e8794eb9-tls-certs\") pod \"precise-prefix-cache-test-kserve-849dfd55d5-vrxdt\" (UID: \"90d19243-b70a-4beb-b67f-ebc0e8794eb9\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-849dfd55d5-vrxdt" Apr 17 17:38:07.378822 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:38:07.378798 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhnmb\" (UniqueName: \"kubernetes.io/projected/90d19243-b70a-4beb-b67f-ebc0e8794eb9-kube-api-access-nhnmb\") pod \"precise-prefix-cache-test-kserve-849dfd55d5-vrxdt\" (UID: \"90d19243-b70a-4beb-b67f-ebc0e8794eb9\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-849dfd55d5-vrxdt" Apr 17 17:38:07.511138 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:38:07.511064 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-849dfd55d5-vrxdt" Apr 17 17:38:07.842470 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:38:07.842395 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-849dfd55d5-vrxdt"] Apr 17 17:38:07.848754 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:38:07.848719 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod90d19243_b70a_4beb_b67f_ebc0e8794eb9.slice/crio-33ae28c4ee6288155f28f24f22993797031da4f297a1856a0e9dfb500697a4e9 WatchSource:0}: Error finding container 33ae28c4ee6288155f28f24f22993797031da4f297a1856a0e9dfb500697a4e9: Status 404 returned error can't find the container with id 33ae28c4ee6288155f28f24f22993797031da4f297a1856a0e9dfb500697a4e9 Apr 17 17:38:08.729836 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:38:08.729803 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-849dfd55d5-vrxdt" event={"ID":"90d19243-b70a-4beb-b67f-ebc0e8794eb9","Type":"ContainerStarted","Data":"e256c0eb4cffe0a9d0ba9e22560935905ca41d4095f1f731e1f32ff1830e48a9"} Apr 17 17:38:08.729836 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:38:08.729839 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-849dfd55d5-vrxdt" event={"ID":"90d19243-b70a-4beb-b67f-ebc0e8794eb9","Type":"ContainerStarted","Data":"33ae28c4ee6288155f28f24f22993797031da4f297a1856a0e9dfb500697a4e9"} Apr 17 17:38:12.744191 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:38:12.744147 2573 generic.go:358] "Generic (PLEG): container finished" podID="90d19243-b70a-4beb-b67f-ebc0e8794eb9" containerID="e256c0eb4cffe0a9d0ba9e22560935905ca41d4095f1f731e1f32ff1830e48a9" exitCode=0 Apr 17 17:38:12.744589 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:38:12.744214 2573 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-849dfd55d5-vrxdt" event={"ID":"90d19243-b70a-4beb-b67f-ebc0e8794eb9","Type":"ContainerDied","Data":"e256c0eb4cffe0a9d0ba9e22560935905ca41d4095f1f731e1f32ff1830e48a9"} Apr 17 17:38:13.749789 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:38:13.749753 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-849dfd55d5-vrxdt" event={"ID":"90d19243-b70a-4beb-b67f-ebc0e8794eb9","Type":"ContainerStarted","Data":"52f4d2cd1a5df7758935ac57de9c6162d4b2978816e965c1882098fbc880dd99"} Apr 17 17:38:13.782819 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:38:13.782741 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-849dfd55d5-vrxdt" podStartSLOduration=7.782723163 podStartE2EDuration="7.782723163s" podCreationTimestamp="2026-04-17 17:38:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:38:13.781245758 +0000 UTC m=+839.097327623" watchObservedRunningTime="2026-04-17 17:38:13.782723163 +0000 UTC m=+839.098805004" Apr 17 17:38:17.126318 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:38:17.126277 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-779b5987b68r64h" podUID="aa50f95e-ff8f-469c-bcbf-5b907b155a1e" containerName="main" probeResult="failure" output="Get \"https://10.133.0.21:8000/health\": dial tcp 10.133.0.21:8000: connect: connection refused" Apr 17 17:38:17.511397 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:38:17.511311 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-849dfd55d5-vrxdt" Apr 17 17:38:17.511397 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:38:17.511366 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-849dfd55d5-vrxdt" Apr 17 17:38:17.524113 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:38:17.524086 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-849dfd55d5-vrxdt" Apr 17 17:38:17.773574 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:38:17.773499 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-849dfd55d5-vrxdt" Apr 17 17:38:27.127276 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:38:27.127209 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-779b5987b68r64h" podUID="aa50f95e-ff8f-469c-bcbf-5b907b155a1e" containerName="main" probeResult="failure" output="Get \"https://10.133.0.21:8000/health\": dial tcp 10.133.0.21:8000: connect: connection refused" Apr 17 17:38:37.126341 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:38:37.126296 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-779b5987b68r64h" podUID="aa50f95e-ff8f-469c-bcbf-5b907b155a1e" containerName="main" probeResult="failure" output="Get \"https://10.133.0.21:8000/health\": dial tcp 10.133.0.21:8000: connect: connection refused" Apr 17 17:38:39.727545 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:38:39.727516 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-849dfd55d5-vrxdt"] Apr 17 17:38:39.727935 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:38:39.727801 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-849dfd55d5-vrxdt" podUID="90d19243-b70a-4beb-b67f-ebc0e8794eb9" containerName="main" containerID="cri-o://52f4d2cd1a5df7758935ac57de9c6162d4b2978816e965c1882098fbc880dd99" 
gracePeriod=30 Apr 17 17:38:39.982432 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:38:39.982377 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-849dfd55d5-vrxdt" Apr 17 17:38:40.144080 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:38:40.144045 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nhnmb\" (UniqueName: \"kubernetes.io/projected/90d19243-b70a-4beb-b67f-ebc0e8794eb9-kube-api-access-nhnmb\") pod \"90d19243-b70a-4beb-b67f-ebc0e8794eb9\" (UID: \"90d19243-b70a-4beb-b67f-ebc0e8794eb9\") " Apr 17 17:38:40.144080 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:38:40.144084 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/90d19243-b70a-4beb-b67f-ebc0e8794eb9-dshm\") pod \"90d19243-b70a-4beb-b67f-ebc0e8794eb9\" (UID: \"90d19243-b70a-4beb-b67f-ebc0e8794eb9\") " Apr 17 17:38:40.144306 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:38:40.144101 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/90d19243-b70a-4beb-b67f-ebc0e8794eb9-model-cache\") pod \"90d19243-b70a-4beb-b67f-ebc0e8794eb9\" (UID: \"90d19243-b70a-4beb-b67f-ebc0e8794eb9\") " Apr 17 17:38:40.144306 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:38:40.144123 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/90d19243-b70a-4beb-b67f-ebc0e8794eb9-kserve-provision-location\") pod \"90d19243-b70a-4beb-b67f-ebc0e8794eb9\" (UID: \"90d19243-b70a-4beb-b67f-ebc0e8794eb9\") " Apr 17 17:38:40.144306 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:38:40.144152 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/90d19243-b70a-4beb-b67f-ebc0e8794eb9-tls-certs\") pod \"90d19243-b70a-4beb-b67f-ebc0e8794eb9\" (UID: \"90d19243-b70a-4beb-b67f-ebc0e8794eb9\") " Apr 17 17:38:40.144306 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:38:40.144173 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/90d19243-b70a-4beb-b67f-ebc0e8794eb9-home\") pod \"90d19243-b70a-4beb-b67f-ebc0e8794eb9\" (UID: \"90d19243-b70a-4beb-b67f-ebc0e8794eb9\") " Apr 17 17:38:40.144516 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:38:40.144341 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/90d19243-b70a-4beb-b67f-ebc0e8794eb9-model-cache" (OuterVolumeSpecName: "model-cache") pod "90d19243-b70a-4beb-b67f-ebc0e8794eb9" (UID: "90d19243-b70a-4beb-b67f-ebc0e8794eb9"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:38:40.144516 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:38:40.144483 2573 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/90d19243-b70a-4beb-b67f-ebc0e8794eb9-model-cache\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\"" Apr 17 17:38:40.144516 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:38:40.144500 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/90d19243-b70a-4beb-b67f-ebc0e8794eb9-home" (OuterVolumeSpecName: "home") pod "90d19243-b70a-4beb-b67f-ebc0e8794eb9" (UID: "90d19243-b70a-4beb-b67f-ebc0e8794eb9"). InnerVolumeSpecName "home". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:38:40.146370 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:38:40.146337 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90d19243-b70a-4beb-b67f-ebc0e8794eb9-kube-api-access-nhnmb" (OuterVolumeSpecName: "kube-api-access-nhnmb") pod "90d19243-b70a-4beb-b67f-ebc0e8794eb9" (UID: "90d19243-b70a-4beb-b67f-ebc0e8794eb9"). InnerVolumeSpecName "kube-api-access-nhnmb". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:38:40.146595 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:38:40.146570 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/90d19243-b70a-4beb-b67f-ebc0e8794eb9-dshm" (OuterVolumeSpecName: "dshm") pod "90d19243-b70a-4beb-b67f-ebc0e8794eb9" (UID: "90d19243-b70a-4beb-b67f-ebc0e8794eb9"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:38:40.146802 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:38:40.146779 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90d19243-b70a-4beb-b67f-ebc0e8794eb9-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "90d19243-b70a-4beb-b67f-ebc0e8794eb9" (UID: "90d19243-b70a-4beb-b67f-ebc0e8794eb9"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:38:40.199091 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:38:40.199050 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/90d19243-b70a-4beb-b67f-ebc0e8794eb9-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "90d19243-b70a-4beb-b67f-ebc0e8794eb9" (UID: "90d19243-b70a-4beb-b67f-ebc0e8794eb9"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:38:40.245756 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:38:40.245686 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nhnmb\" (UniqueName: \"kubernetes.io/projected/90d19243-b70a-4beb-b67f-ebc0e8794eb9-kube-api-access-nhnmb\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\"" Apr 17 17:38:40.245756 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:38:40.245714 2573 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/90d19243-b70a-4beb-b67f-ebc0e8794eb9-dshm\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\"" Apr 17 17:38:40.245756 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:38:40.245724 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/90d19243-b70a-4beb-b67f-ebc0e8794eb9-kserve-provision-location\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\"" Apr 17 17:38:40.245756 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:38:40.245735 2573 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/90d19243-b70a-4beb-b67f-ebc0e8794eb9-tls-certs\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\"" Apr 17 17:38:40.245756 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:38:40.245745 2573 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/90d19243-b70a-4beb-b67f-ebc0e8794eb9-home\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\"" Apr 17 17:38:40.835872 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:38:40.835835 2573 generic.go:358] "Generic (PLEG): container finished" podID="90d19243-b70a-4beb-b67f-ebc0e8794eb9" containerID="52f4d2cd1a5df7758935ac57de9c6162d4b2978816e965c1882098fbc880dd99" exitCode=0 Apr 17 17:38:40.836345 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:38:40.835912 2573 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-849dfd55d5-vrxdt" Apr 17 17:38:40.836345 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:38:40.835926 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-849dfd55d5-vrxdt" event={"ID":"90d19243-b70a-4beb-b67f-ebc0e8794eb9","Type":"ContainerDied","Data":"52f4d2cd1a5df7758935ac57de9c6162d4b2978816e965c1882098fbc880dd99"} Apr 17 17:38:40.836345 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:38:40.835961 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-849dfd55d5-vrxdt" event={"ID":"90d19243-b70a-4beb-b67f-ebc0e8794eb9","Type":"ContainerDied","Data":"33ae28c4ee6288155f28f24f22993797031da4f297a1856a0e9dfb500697a4e9"} Apr 17 17:38:40.836345 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:38:40.835979 2573 scope.go:117] "RemoveContainer" containerID="52f4d2cd1a5df7758935ac57de9c6162d4b2978816e965c1882098fbc880dd99" Apr 17 17:38:40.847841 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:38:40.847120 2573 scope.go:117] "RemoveContainer" containerID="e256c0eb4cffe0a9d0ba9e22560935905ca41d4095f1f731e1f32ff1830e48a9" Apr 17 17:38:40.856893 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:38:40.856877 2573 scope.go:117] "RemoveContainer" containerID="52f4d2cd1a5df7758935ac57de9c6162d4b2978816e965c1882098fbc880dd99" Apr 17 17:38:40.857151 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:38:40.857132 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52f4d2cd1a5df7758935ac57de9c6162d4b2978816e965c1882098fbc880dd99\": container with ID starting with 52f4d2cd1a5df7758935ac57de9c6162d4b2978816e965c1882098fbc880dd99 not found: ID does not exist" containerID="52f4d2cd1a5df7758935ac57de9c6162d4b2978816e965c1882098fbc880dd99" Apr 17 17:38:40.857229 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:38:40.857162 2573 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52f4d2cd1a5df7758935ac57de9c6162d4b2978816e965c1882098fbc880dd99"} err="failed to get container status \"52f4d2cd1a5df7758935ac57de9c6162d4b2978816e965c1882098fbc880dd99\": rpc error: code = NotFound desc = could not find container \"52f4d2cd1a5df7758935ac57de9c6162d4b2978816e965c1882098fbc880dd99\": container with ID starting with 52f4d2cd1a5df7758935ac57de9c6162d4b2978816e965c1882098fbc880dd99 not found: ID does not exist" Apr 17 17:38:40.857229 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:38:40.857186 2573 scope.go:117] "RemoveContainer" containerID="e256c0eb4cffe0a9d0ba9e22560935905ca41d4095f1f731e1f32ff1830e48a9" Apr 17 17:38:40.857442 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:38:40.857425 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e256c0eb4cffe0a9d0ba9e22560935905ca41d4095f1f731e1f32ff1830e48a9\": container with ID starting with e256c0eb4cffe0a9d0ba9e22560935905ca41d4095f1f731e1f32ff1830e48a9 not found: ID does not exist" containerID="e256c0eb4cffe0a9d0ba9e22560935905ca41d4095f1f731e1f32ff1830e48a9" Apr 17 17:38:40.857485 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:38:40.857449 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e256c0eb4cffe0a9d0ba9e22560935905ca41d4095f1f731e1f32ff1830e48a9"} err="failed to get container status \"e256c0eb4cffe0a9d0ba9e22560935905ca41d4095f1f731e1f32ff1830e48a9\": rpc error: code = NotFound desc = could not find container \"e256c0eb4cffe0a9d0ba9e22560935905ca41d4095f1f731e1f32ff1830e48a9\": container with ID starting with e256c0eb4cffe0a9d0ba9e22560935905ca41d4095f1f731e1f32ff1830e48a9 not found: ID does not exist" Apr 17 17:38:40.862037 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:38:40.862001 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-849dfd55d5-vrxdt"] Apr 17 17:38:40.866676 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:38:40.866652 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-849dfd55d5-vrxdt"] Apr 17 17:38:41.196480 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:38:41.196447 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90d19243-b70a-4beb-b67f-ebc0e8794eb9" path="/var/lib/kubelet/pods/90d19243-b70a-4beb-b67f-ebc0e8794eb9/volumes" Apr 17 17:38:47.126871 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:38:47.126833 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-779b5987b68r64h" podUID="aa50f95e-ff8f-469c-bcbf-5b907b155a1e" containerName="main" probeResult="failure" output="Get \"https://10.133.0.21:8000/health\": dial tcp 10.133.0.21:8000: connect: connection refused" Apr 17 17:38:50.820638 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:38:50.820604 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/conv-test-round-trip-kserve-68b88fd597-xt9wx"] Apr 17 17:38:50.821142 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:38:50.821039 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="90d19243-b70a-4beb-b67f-ebc0e8794eb9" containerName="main" Apr 17 17:38:50.821142 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:38:50.821054 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="90d19243-b70a-4beb-b67f-ebc0e8794eb9" containerName="main" Apr 17 17:38:50.821142 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:38:50.821067 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="90d19243-b70a-4beb-b67f-ebc0e8794eb9" containerName="storage-initializer" Apr 17 17:38:50.821142 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:38:50.821073 2573 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="90d19243-b70a-4beb-b67f-ebc0e8794eb9" containerName="storage-initializer" Apr 17 17:38:50.821142 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:38:50.821129 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="90d19243-b70a-4beb-b67f-ebc0e8794eb9" containerName="main" Apr 17 17:38:50.826546 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:38:50.826522 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-68b88fd597-xt9wx" Apr 17 17:38:50.829005 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:38:50.828982 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"conv-test-round-trip-kserve-self-signed-certs\"" Apr 17 17:38:50.837903 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:38:50.837877 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/conv-test-round-trip-kserve-68b88fd597-xt9wx"] Apr 17 17:38:50.933168 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:38:50.933123 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f80b8344-9646-4361-95a2-a54861e16606-kserve-provision-location\") pod \"conv-test-round-trip-kserve-68b88fd597-xt9wx\" (UID: \"f80b8344-9646-4361-95a2-a54861e16606\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-68b88fd597-xt9wx" Apr 17 17:38:50.933316 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:38:50.933185 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f80b8344-9646-4361-95a2-a54861e16606-home\") pod \"conv-test-round-trip-kserve-68b88fd597-xt9wx\" (UID: \"f80b8344-9646-4361-95a2-a54861e16606\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-68b88fd597-xt9wx" Apr 17 17:38:50.933316 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:38:50.933228 2573 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f80b8344-9646-4361-95a2-a54861e16606-dshm\") pod \"conv-test-round-trip-kserve-68b88fd597-xt9wx\" (UID: \"f80b8344-9646-4361-95a2-a54861e16606\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-68b88fd597-xt9wx" Apr 17 17:38:50.933399 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:38:50.933317 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f80b8344-9646-4361-95a2-a54861e16606-model-cache\") pod \"conv-test-round-trip-kserve-68b88fd597-xt9wx\" (UID: \"f80b8344-9646-4361-95a2-a54861e16606\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-68b88fd597-xt9wx" Apr 17 17:38:50.933399 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:38:50.933366 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8ljs\" (UniqueName: \"kubernetes.io/projected/f80b8344-9646-4361-95a2-a54861e16606-kube-api-access-f8ljs\") pod \"conv-test-round-trip-kserve-68b88fd597-xt9wx\" (UID: \"f80b8344-9646-4361-95a2-a54861e16606\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-68b88fd597-xt9wx" Apr 17 17:38:50.933399 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:38:50.933389 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f80b8344-9646-4361-95a2-a54861e16606-tls-certs\") pod \"conv-test-round-trip-kserve-68b88fd597-xt9wx\" (UID: \"f80b8344-9646-4361-95a2-a54861e16606\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-68b88fd597-xt9wx" Apr 17 17:38:51.034067 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:38:51.034005 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/f80b8344-9646-4361-95a2-a54861e16606-kserve-provision-location\") pod \"conv-test-round-trip-kserve-68b88fd597-xt9wx\" (UID: \"f80b8344-9646-4361-95a2-a54861e16606\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-68b88fd597-xt9wx" Apr 17 17:38:51.034217 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:38:51.034083 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f80b8344-9646-4361-95a2-a54861e16606-home\") pod \"conv-test-round-trip-kserve-68b88fd597-xt9wx\" (UID: \"f80b8344-9646-4361-95a2-a54861e16606\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-68b88fd597-xt9wx" Apr 17 17:38:51.034217 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:38:51.034123 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f80b8344-9646-4361-95a2-a54861e16606-dshm\") pod \"conv-test-round-trip-kserve-68b88fd597-xt9wx\" (UID: \"f80b8344-9646-4361-95a2-a54861e16606\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-68b88fd597-xt9wx" Apr 17 17:38:51.034217 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:38:51.034181 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f80b8344-9646-4361-95a2-a54861e16606-model-cache\") pod \"conv-test-round-trip-kserve-68b88fd597-xt9wx\" (UID: \"f80b8344-9646-4361-95a2-a54861e16606\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-68b88fd597-xt9wx" Apr 17 17:38:51.034217 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:38:51.034205 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f8ljs\" (UniqueName: \"kubernetes.io/projected/f80b8344-9646-4361-95a2-a54861e16606-kube-api-access-f8ljs\") pod \"conv-test-round-trip-kserve-68b88fd597-xt9wx\" (UID: \"f80b8344-9646-4361-95a2-a54861e16606\") " 
pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-68b88fd597-xt9wx" Apr 17 17:38:51.034429 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:38:51.034236 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f80b8344-9646-4361-95a2-a54861e16606-tls-certs\") pod \"conv-test-round-trip-kserve-68b88fd597-xt9wx\" (UID: \"f80b8344-9646-4361-95a2-a54861e16606\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-68b88fd597-xt9wx" Apr 17 17:38:51.034483 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:38:51.034443 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f80b8344-9646-4361-95a2-a54861e16606-kserve-provision-location\") pod \"conv-test-round-trip-kserve-68b88fd597-xt9wx\" (UID: \"f80b8344-9646-4361-95a2-a54861e16606\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-68b88fd597-xt9wx" Apr 17 17:38:51.034541 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:38:51.034503 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f80b8344-9646-4361-95a2-a54861e16606-home\") pod \"conv-test-round-trip-kserve-68b88fd597-xt9wx\" (UID: \"f80b8344-9646-4361-95a2-a54861e16606\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-68b88fd597-xt9wx" Apr 17 17:38:51.034541 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:38:51.034525 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f80b8344-9646-4361-95a2-a54861e16606-model-cache\") pod \"conv-test-round-trip-kserve-68b88fd597-xt9wx\" (UID: \"f80b8344-9646-4361-95a2-a54861e16606\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-68b88fd597-xt9wx" Apr 17 17:38:51.036594 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:38:51.036572 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: 
\"kubernetes.io/empty-dir/f80b8344-9646-4361-95a2-a54861e16606-dshm\") pod \"conv-test-round-trip-kserve-68b88fd597-xt9wx\" (UID: \"f80b8344-9646-4361-95a2-a54861e16606\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-68b88fd597-xt9wx" Apr 17 17:38:51.036843 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:38:51.036827 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f80b8344-9646-4361-95a2-a54861e16606-tls-certs\") pod \"conv-test-round-trip-kserve-68b88fd597-xt9wx\" (UID: \"f80b8344-9646-4361-95a2-a54861e16606\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-68b88fd597-xt9wx" Apr 17 17:38:51.043757 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:38:51.043733 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8ljs\" (UniqueName: \"kubernetes.io/projected/f80b8344-9646-4361-95a2-a54861e16606-kube-api-access-f8ljs\") pod \"conv-test-round-trip-kserve-68b88fd597-xt9wx\" (UID: \"f80b8344-9646-4361-95a2-a54861e16606\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-68b88fd597-xt9wx" Apr 17 17:38:51.140144 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:38:51.140110 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-68b88fd597-xt9wx" Apr 17 17:38:51.273403 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:38:51.273373 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/conv-test-round-trip-kserve-68b88fd597-xt9wx"] Apr 17 17:38:51.276790 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:38:51.276757 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf80b8344_9646_4361_95a2_a54861e16606.slice/crio-26ec1ff2065d4e0dccc8a415e7b238c89be244470724dc58512ca239033fec8c WatchSource:0}: Error finding container 26ec1ff2065d4e0dccc8a415e7b238c89be244470724dc58512ca239033fec8c: Status 404 returned error can't find the container with id 26ec1ff2065d4e0dccc8a415e7b238c89be244470724dc58512ca239033fec8c Apr 17 17:38:51.874141 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:38:51.874095 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-68b88fd597-xt9wx" event={"ID":"f80b8344-9646-4361-95a2-a54861e16606","Type":"ContainerStarted","Data":"c85558facaeedfc6278dbcd1fc722d2e52a6404fe31376bcb8c635c176ffe8a1"} Apr 17 17:38:51.874141 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:38:51.874144 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-68b88fd597-xt9wx" event={"ID":"f80b8344-9646-4361-95a2-a54861e16606","Type":"ContainerStarted","Data":"26ec1ff2065d4e0dccc8a415e7b238c89be244470724dc58512ca239033fec8c"} Apr 17 17:38:55.890316 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:38:55.890281 2573 generic.go:358] "Generic (PLEG): container finished" podID="f80b8344-9646-4361-95a2-a54861e16606" containerID="c85558facaeedfc6278dbcd1fc722d2e52a6404fe31376bcb8c635c176ffe8a1" exitCode=0 Apr 17 17:38:55.890719 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:38:55.890324 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-68b88fd597-xt9wx" event={"ID":"f80b8344-9646-4361-95a2-a54861e16606","Type":"ContainerDied","Data":"c85558facaeedfc6278dbcd1fc722d2e52a6404fe31376bcb8c635c176ffe8a1"} Apr 17 17:38:56.895782 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:38:56.895752 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-68b88fd597-xt9wx" event={"ID":"f80b8344-9646-4361-95a2-a54861e16606","Type":"ContainerStarted","Data":"14df65c0580960498c2b018d28d31ab1804450014e66c648e4ffdd3a83a0616c"} Apr 17 17:38:56.920337 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:38:56.920281 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-68b88fd597-xt9wx" podStartSLOduration=6.920264245 podStartE2EDuration="6.920264245s" podCreationTimestamp="2026-04-17 17:38:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:38:56.918094304 +0000 UTC m=+882.234176147" watchObservedRunningTime="2026-04-17 17:38:56.920264245 +0000 UTC m=+882.236346088" Apr 17 17:38:57.127367 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:38:57.127322 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-779b5987b68r64h" podUID="aa50f95e-ff8f-469c-bcbf-5b907b155a1e" containerName="main" probeResult="failure" output="Get \"https://10.133.0.21:8000/health\": dial tcp 10.133.0.21:8000: connect: connection refused" Apr 17 17:39:00.711271 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:00.711233 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-769d6cb8-m8jc5"] Apr 17 17:39:00.714102 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:00.714080 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-769d6cb8-m8jc5" Apr 17 17:39:00.716714 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:00.716688 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-kserve-self-signed-certs\"" Apr 17 17:39:00.724853 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:00.724833 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-769d6cb8-m8jc5"] Apr 17 17:39:00.814940 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:00.814908 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/33abd26c-a518-4b23-949b-c58b1d006c59-kserve-provision-location\") pod \"stop-feature-test-kserve-769d6cb8-m8jc5\" (UID: \"33abd26c-a518-4b23-949b-c58b1d006c59\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-769d6cb8-m8jc5" Apr 17 17:39:00.815106 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:00.814956 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/33abd26c-a518-4b23-949b-c58b1d006c59-home\") pod \"stop-feature-test-kserve-769d6cb8-m8jc5\" (UID: \"33abd26c-a518-4b23-949b-c58b1d006c59\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-769d6cb8-m8jc5" Apr 17 17:39:00.815106 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:00.814982 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/33abd26c-a518-4b23-949b-c58b1d006c59-model-cache\") pod \"stop-feature-test-kserve-769d6cb8-m8jc5\" (UID: \"33abd26c-a518-4b23-949b-c58b1d006c59\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-769d6cb8-m8jc5" Apr 17 17:39:00.815106 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:00.815049 2573 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9t64q\" (UniqueName: \"kubernetes.io/projected/33abd26c-a518-4b23-949b-c58b1d006c59-kube-api-access-9t64q\") pod \"stop-feature-test-kserve-769d6cb8-m8jc5\" (UID: \"33abd26c-a518-4b23-949b-c58b1d006c59\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-769d6cb8-m8jc5" Apr 17 17:39:00.815106 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:00.815075 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/33abd26c-a518-4b23-949b-c58b1d006c59-dshm\") pod \"stop-feature-test-kserve-769d6cb8-m8jc5\" (UID: \"33abd26c-a518-4b23-949b-c58b1d006c59\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-769d6cb8-m8jc5" Apr 17 17:39:00.815106 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:00.815099 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/33abd26c-a518-4b23-949b-c58b1d006c59-tls-certs\") pod \"stop-feature-test-kserve-769d6cb8-m8jc5\" (UID: \"33abd26c-a518-4b23-949b-c58b1d006c59\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-769d6cb8-m8jc5" Apr 17 17:39:00.916334 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:00.916295 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/33abd26c-a518-4b23-949b-c58b1d006c59-tls-certs\") pod \"stop-feature-test-kserve-769d6cb8-m8jc5\" (UID: \"33abd26c-a518-4b23-949b-c58b1d006c59\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-769d6cb8-m8jc5" Apr 17 17:39:00.916524 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:00.916355 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/33abd26c-a518-4b23-949b-c58b1d006c59-kserve-provision-location\") pod 
\"stop-feature-test-kserve-769d6cb8-m8jc5\" (UID: \"33abd26c-a518-4b23-949b-c58b1d006c59\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-769d6cb8-m8jc5" Apr 17 17:39:00.916524 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:00.916398 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/33abd26c-a518-4b23-949b-c58b1d006c59-home\") pod \"stop-feature-test-kserve-769d6cb8-m8jc5\" (UID: \"33abd26c-a518-4b23-949b-c58b1d006c59\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-769d6cb8-m8jc5" Apr 17 17:39:00.916524 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:00.916430 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/33abd26c-a518-4b23-949b-c58b1d006c59-model-cache\") pod \"stop-feature-test-kserve-769d6cb8-m8jc5\" (UID: \"33abd26c-a518-4b23-949b-c58b1d006c59\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-769d6cb8-m8jc5" Apr 17 17:39:00.916524 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:00.916475 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9t64q\" (UniqueName: \"kubernetes.io/projected/33abd26c-a518-4b23-949b-c58b1d006c59-kube-api-access-9t64q\") pod \"stop-feature-test-kserve-769d6cb8-m8jc5\" (UID: \"33abd26c-a518-4b23-949b-c58b1d006c59\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-769d6cb8-m8jc5" Apr 17 17:39:00.916524 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:00.916512 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/33abd26c-a518-4b23-949b-c58b1d006c59-dshm\") pod \"stop-feature-test-kserve-769d6cb8-m8jc5\" (UID: \"33abd26c-a518-4b23-949b-c58b1d006c59\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-769d6cb8-m8jc5" Apr 17 17:39:00.916962 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:00.916911 2573 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/33abd26c-a518-4b23-949b-c58b1d006c59-kserve-provision-location\") pod \"stop-feature-test-kserve-769d6cb8-m8jc5\" (UID: \"33abd26c-a518-4b23-949b-c58b1d006c59\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-769d6cb8-m8jc5" Apr 17 17:39:00.917113 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:00.916977 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/33abd26c-a518-4b23-949b-c58b1d006c59-home\") pod \"stop-feature-test-kserve-769d6cb8-m8jc5\" (UID: \"33abd26c-a518-4b23-949b-c58b1d006c59\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-769d6cb8-m8jc5" Apr 17 17:39:00.917113 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:00.917057 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/33abd26c-a518-4b23-949b-c58b1d006c59-model-cache\") pod \"stop-feature-test-kserve-769d6cb8-m8jc5\" (UID: \"33abd26c-a518-4b23-949b-c58b1d006c59\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-769d6cb8-m8jc5" Apr 17 17:39:00.919271 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:00.919247 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/33abd26c-a518-4b23-949b-c58b1d006c59-dshm\") pod \"stop-feature-test-kserve-769d6cb8-m8jc5\" (UID: \"33abd26c-a518-4b23-949b-c58b1d006c59\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-769d6cb8-m8jc5" Apr 17 17:39:00.919537 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:00.919505 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/33abd26c-a518-4b23-949b-c58b1d006c59-tls-certs\") pod \"stop-feature-test-kserve-769d6cb8-m8jc5\" (UID: \"33abd26c-a518-4b23-949b-c58b1d006c59\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-769d6cb8-m8jc5" 
Apr 17 17:39:00.924981 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:00.924940 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9t64q\" (UniqueName: \"kubernetes.io/projected/33abd26c-a518-4b23-949b-c58b1d006c59-kube-api-access-9t64q\") pod \"stop-feature-test-kserve-769d6cb8-m8jc5\" (UID: \"33abd26c-a518-4b23-949b-c58b1d006c59\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-769d6cb8-m8jc5" Apr 17 17:39:01.049623 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:01.049529 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-769d6cb8-m8jc5" Apr 17 17:39:01.141394 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:01.141227 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-68b88fd597-xt9wx" Apr 17 17:39:01.141394 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:01.141273 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-68b88fd597-xt9wx" Apr 17 17:39:01.142683 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:01.142653 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-68b88fd597-xt9wx" podUID="f80b8344-9646-4361-95a2-a54861e16606" containerName="main" probeResult="failure" output="Get \"https://10.133.0.24:8000/health\": dial tcp 10.133.0.24:8000: connect: connection refused" Apr 17 17:39:01.187119 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:01.187086 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-769d6cb8-m8jc5"] Apr 17 17:39:01.189619 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:39:01.189587 2573 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33abd26c_a518_4b23_949b_c58b1d006c59.slice/crio-1a446d88a39bc53c3f1495a9dd31afcac05435496bc1fa2c91d57d90513b825e WatchSource:0}: Error finding container 1a446d88a39bc53c3f1495a9dd31afcac05435496bc1fa2c91d57d90513b825e: Status 404 returned error can't find the container with id 1a446d88a39bc53c3f1495a9dd31afcac05435496bc1fa2c91d57d90513b825e Apr 17 17:39:01.913516 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:01.913474 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-769d6cb8-m8jc5" event={"ID":"33abd26c-a518-4b23-949b-c58b1d006c59","Type":"ContainerStarted","Data":"c600bb1228670832dca8018d549887502bd65dc3c615f26e4b87abc3d5d29624"} Apr 17 17:39:01.913516 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:01.913525 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-769d6cb8-m8jc5" event={"ID":"33abd26c-a518-4b23-949b-c58b1d006c59","Type":"ContainerStarted","Data":"1a446d88a39bc53c3f1495a9dd31afcac05435496bc1fa2c91d57d90513b825e"} Apr 17 17:39:03.688566 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:03.688518 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/conv-test-round-trip-kserve-68b88fd597-xt9wx"] Apr 17 17:39:03.689050 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:03.688826 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-68b88fd597-xt9wx" podUID="f80b8344-9646-4361-95a2-a54861e16606" containerName="main" containerID="cri-o://14df65c0580960498c2b018d28d31ab1804450014e66c648e4ffdd3a83a0616c" gracePeriod=30 Apr 17 17:39:05.929152 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:05.929119 2573 generic.go:358] "Generic (PLEG): container finished" podID="33abd26c-a518-4b23-949b-c58b1d006c59" containerID="c600bb1228670832dca8018d549887502bd65dc3c615f26e4b87abc3d5d29624" exitCode=0 Apr 17 
17:39:05.929506 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:05.929202 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-769d6cb8-m8jc5" event={"ID":"33abd26c-a518-4b23-949b-c58b1d006c59","Type":"ContainerDied","Data":"c600bb1228670832dca8018d549887502bd65dc3c615f26e4b87abc3d5d29624"} Apr 17 17:39:06.934809 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:06.934772 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-769d6cb8-m8jc5" event={"ID":"33abd26c-a518-4b23-949b-c58b1d006c59","Type":"ContainerStarted","Data":"8441d18ac5493675ae71f8493216746fbf307330281cec775bb5d7d6a919a2cf"} Apr 17 17:39:06.959689 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:06.959637 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/stop-feature-test-kserve-769d6cb8-m8jc5" podStartSLOduration=6.959618219 podStartE2EDuration="6.959618219s" podCreationTimestamp="2026-04-17 17:39:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:39:06.956523115 +0000 UTC m=+892.272604955" watchObservedRunningTime="2026-04-17 17:39:06.959618219 +0000 UTC m=+892.275700063" Apr 17 17:39:07.126702 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:07.126655 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-779b5987b68r64h" podUID="aa50f95e-ff8f-469c-bcbf-5b907b155a1e" containerName="main" probeResult="failure" output="Get \"https://10.133.0.21:8000/health\": dial tcp 10.133.0.21:8000: connect: connection refused" Apr 17 17:39:11.049814 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:11.049780 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-769d6cb8-m8jc5" Apr 17 17:39:11.050208 ip-10-0-139-84 
kubenswrapper[2573]: I0417 17:39:11.049939 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/stop-feature-test-kserve-769d6cb8-m8jc5" Apr 17 17:39:11.051669 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:11.051644 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-769d6cb8-m8jc5" podUID="33abd26c-a518-4b23-949b-c58b1d006c59" containerName="main" probeResult="failure" output="Get \"https://10.133.0.25:8000/health\": dial tcp 10.133.0.25:8000: connect: connection refused" Apr 17 17:39:17.146405 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:17.145994 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-779b5987b68r64h" Apr 17 17:39:17.160767 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:17.160735 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-779b5987b68r64h" Apr 17 17:39:21.050241 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:21.050196 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-769d6cb8-m8jc5" podUID="33abd26c-a518-4b23-949b-c58b1d006c59" containerName="main" probeResult="failure" output="Get \"https://10.133.0.25:8000/health\": dial tcp 10.133.0.25:8000: connect: connection refused" Apr 17 17:39:26.673710 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:26.673665 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-779b5987b68r64h"] Apr 17 17:39:26.674196 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:26.674069 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-779b5987b68r64h" podUID="aa50f95e-ff8f-469c-bcbf-5b907b155a1e" containerName="main" 
containerID="cri-o://f04979a773ef75844b51e649e299905d9e1e9ee7ad4032e371803287aa3e340f" gracePeriod=30 Apr 17 17:39:31.050170 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:31.050124 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-769d6cb8-m8jc5" podUID="33abd26c-a518-4b23-949b-c58b1d006c59" containerName="main" probeResult="failure" output="Get \"https://10.133.0.25:8000/health\": dial tcp 10.133.0.25:8000: connect: connection refused" Apr 17 17:39:32.320303 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:32.317532 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-79c5b74fb-zwv6t"] Apr 17 17:39:32.325181 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:32.325153 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-79c5b74fb-zwv6t" Apr 17 17:39:32.328248 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:32.328075 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-test-kserve-self-signed-certs\"" Apr 17 17:39:32.330064 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:32.330039 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-79c5b74fb-zwv6t"] Apr 17 17:39:32.425399 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:32.425361 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d31a00e2-ce2a-4b4e-ab86-26eba01505bf-home\") pod \"custom-route-timeout-test-kserve-79c5b74fb-zwv6t\" (UID: \"d31a00e2-ce2a-4b4e-ab86-26eba01505bf\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-79c5b74fb-zwv6t" Apr 17 17:39:32.425399 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:32.425398 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d31a00e2-ce2a-4b4e-ab86-26eba01505bf-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-79c5b74fb-zwv6t\" (UID: \"d31a00e2-ce2a-4b4e-ab86-26eba01505bf\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-79c5b74fb-zwv6t" Apr 17 17:39:32.425604 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:32.425435 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d31a00e2-ce2a-4b4e-ab86-26eba01505bf-dshm\") pod \"custom-route-timeout-test-kserve-79c5b74fb-zwv6t\" (UID: \"d31a00e2-ce2a-4b4e-ab86-26eba01505bf\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-79c5b74fb-zwv6t" Apr 17 17:39:32.425604 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:32.425559 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d31a00e2-ce2a-4b4e-ab86-26eba01505bf-model-cache\") pod \"custom-route-timeout-test-kserve-79c5b74fb-zwv6t\" (UID: \"d31a00e2-ce2a-4b4e-ab86-26eba01505bf\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-79c5b74fb-zwv6t" Apr 17 17:39:32.425703 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:32.425623 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d31a00e2-ce2a-4b4e-ab86-26eba01505bf-tls-certs\") pod \"custom-route-timeout-test-kserve-79c5b74fb-zwv6t\" (UID: \"d31a00e2-ce2a-4b4e-ab86-26eba01505bf\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-79c5b74fb-zwv6t" Apr 17 17:39:32.425703 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:32.425656 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrn6r\" (UniqueName: 
\"kubernetes.io/projected/d31a00e2-ce2a-4b4e-ab86-26eba01505bf-kube-api-access-qrn6r\") pod \"custom-route-timeout-test-kserve-79c5b74fb-zwv6t\" (UID: \"d31a00e2-ce2a-4b4e-ab86-26eba01505bf\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-79c5b74fb-zwv6t" Apr 17 17:39:32.526732 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:32.526691 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d31a00e2-ce2a-4b4e-ab86-26eba01505bf-dshm\") pod \"custom-route-timeout-test-kserve-79c5b74fb-zwv6t\" (UID: \"d31a00e2-ce2a-4b4e-ab86-26eba01505bf\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-79c5b74fb-zwv6t" Apr 17 17:39:32.526881 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:32.526761 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d31a00e2-ce2a-4b4e-ab86-26eba01505bf-model-cache\") pod \"custom-route-timeout-test-kserve-79c5b74fb-zwv6t\" (UID: \"d31a00e2-ce2a-4b4e-ab86-26eba01505bf\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-79c5b74fb-zwv6t" Apr 17 17:39:32.526881 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:32.526816 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d31a00e2-ce2a-4b4e-ab86-26eba01505bf-tls-certs\") pod \"custom-route-timeout-test-kserve-79c5b74fb-zwv6t\" (UID: \"d31a00e2-ce2a-4b4e-ab86-26eba01505bf\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-79c5b74fb-zwv6t" Apr 17 17:39:32.526881 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:32.526846 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qrn6r\" (UniqueName: \"kubernetes.io/projected/d31a00e2-ce2a-4b4e-ab86-26eba01505bf-kube-api-access-qrn6r\") pod \"custom-route-timeout-test-kserve-79c5b74fb-zwv6t\" (UID: \"d31a00e2-ce2a-4b4e-ab86-26eba01505bf\") " 
pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-79c5b74fb-zwv6t" Apr 17 17:39:32.527074 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:32.526905 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d31a00e2-ce2a-4b4e-ab86-26eba01505bf-home\") pod \"custom-route-timeout-test-kserve-79c5b74fb-zwv6t\" (UID: \"d31a00e2-ce2a-4b4e-ab86-26eba01505bf\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-79c5b74fb-zwv6t" Apr 17 17:39:32.527074 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:32.526931 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d31a00e2-ce2a-4b4e-ab86-26eba01505bf-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-79c5b74fb-zwv6t\" (UID: \"d31a00e2-ce2a-4b4e-ab86-26eba01505bf\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-79c5b74fb-zwv6t" Apr 17 17:39:32.527186 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:32.527155 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d31a00e2-ce2a-4b4e-ab86-26eba01505bf-model-cache\") pod \"custom-route-timeout-test-kserve-79c5b74fb-zwv6t\" (UID: \"d31a00e2-ce2a-4b4e-ab86-26eba01505bf\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-79c5b74fb-zwv6t" Apr 17 17:39:32.527356 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:32.527309 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d31a00e2-ce2a-4b4e-ab86-26eba01505bf-home\") pod \"custom-route-timeout-test-kserve-79c5b74fb-zwv6t\" (UID: \"d31a00e2-ce2a-4b4e-ab86-26eba01505bf\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-79c5b74fb-zwv6t" Apr 17 17:39:32.527481 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:32.527432 2573 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d31a00e2-ce2a-4b4e-ab86-26eba01505bf-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-79c5b74fb-zwv6t\" (UID: \"d31a00e2-ce2a-4b4e-ab86-26eba01505bf\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-79c5b74fb-zwv6t" Apr 17 17:39:32.529287 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:32.529261 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d31a00e2-ce2a-4b4e-ab86-26eba01505bf-tls-certs\") pod \"custom-route-timeout-test-kserve-79c5b74fb-zwv6t\" (UID: \"d31a00e2-ce2a-4b4e-ab86-26eba01505bf\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-79c5b74fb-zwv6t" Apr 17 17:39:32.529455 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:32.529435 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d31a00e2-ce2a-4b4e-ab86-26eba01505bf-dshm\") pod \"custom-route-timeout-test-kserve-79c5b74fb-zwv6t\" (UID: \"d31a00e2-ce2a-4b4e-ab86-26eba01505bf\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-79c5b74fb-zwv6t" Apr 17 17:39:32.535267 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:32.535241 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrn6r\" (UniqueName: \"kubernetes.io/projected/d31a00e2-ce2a-4b4e-ab86-26eba01505bf-kube-api-access-qrn6r\") pod \"custom-route-timeout-test-kserve-79c5b74fb-zwv6t\" (UID: \"d31a00e2-ce2a-4b4e-ab86-26eba01505bf\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-79c5b74fb-zwv6t" Apr 17 17:39:32.639892 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:32.639857 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-79c5b74fb-zwv6t" Apr 17 17:39:32.769911 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:32.769852 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-79c5b74fb-zwv6t"] Apr 17 17:39:33.034854 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:33.034756 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-79c5b74fb-zwv6t" event={"ID":"d31a00e2-ce2a-4b4e-ab86-26eba01505bf","Type":"ContainerStarted","Data":"6cbe5de6150889d81e1906c4aef0440c95bcbb909088b19454b7e8fecd4866d5"} Apr 17 17:39:33.034854 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:33.034802 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-79c5b74fb-zwv6t" event={"ID":"d31a00e2-ce2a-4b4e-ab86-26eba01505bf","Type":"ContainerStarted","Data":"a495eb59b0fc5136cfe93954d7f2c8b4efa667878a418d6e42aeeb1542224195"} Apr 17 17:39:33.786882 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:39:33.786844 2573 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf80b8344_9646_4361_95a2_a54861e16606.slice/crio-conmon-14df65c0580960498c2b018d28d31ab1804450014e66c648e4ffdd3a83a0616c.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf80b8344_9646_4361_95a2_a54861e16606.slice/crio-26ec1ff2065d4e0dccc8a415e7b238c89be244470724dc58512ca239033fec8c\": RecentStats: unable to find data in memory cache]" Apr 17 17:39:33.787446 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:39:33.787406 2573 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf80b8344_9646_4361_95a2_a54861e16606.slice/crio-14df65c0580960498c2b018d28d31ab1804450014e66c648e4ffdd3a83a0616c.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf80b8344_9646_4361_95a2_a54861e16606.slice/crio-26ec1ff2065d4e0dccc8a415e7b238c89be244470724dc58512ca239033fec8c\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf80b8344_9646_4361_95a2_a54861e16606.slice/crio-conmon-14df65c0580960498c2b018d28d31ab1804450014e66c648e4ffdd3a83a0616c.scope\": RecentStats: unable to find data in memory cache]" Apr 17 17:39:33.787446 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:39:33.786891 2573 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf80b8344_9646_4361_95a2_a54861e16606.slice/crio-26ec1ff2065d4e0dccc8a415e7b238c89be244470724dc58512ca239033fec8c\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf80b8344_9646_4361_95a2_a54861e16606.slice/crio-14df65c0580960498c2b018d28d31ab1804450014e66c648e4ffdd3a83a0616c.scope\": RecentStats: unable to find data in memory cache]" Apr 17 17:39:33.787634 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:39:33.786901 2573 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf80b8344_9646_4361_95a2_a54861e16606.slice/crio-26ec1ff2065d4e0dccc8a415e7b238c89be244470724dc58512ca239033fec8c\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf80b8344_9646_4361_95a2_a54861e16606.slice/crio-14df65c0580960498c2b018d28d31ab1804450014e66c648e4ffdd3a83a0616c.scope\": 
RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf80b8344_9646_4361_95a2_a54861e16606.slice/crio-conmon-14df65c0580960498c2b018d28d31ab1804450014e66c648e4ffdd3a83a0616c.scope\": RecentStats: unable to find data in memory cache]" Apr 17 17:39:33.787801 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:39:33.787332 2573 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf80b8344_9646_4361_95a2_a54861e16606.slice/crio-26ec1ff2065d4e0dccc8a415e7b238c89be244470724dc58512ca239033fec8c\": RecentStats: unable to find data in memory cache]" Apr 17 17:39:33.960605 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:33.960569 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_conv-test-round-trip-kserve-68b88fd597-xt9wx_f80b8344-9646-4361-95a2-a54861e16606/main/0.log" Apr 17 17:39:33.961082 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:33.961060 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-68b88fd597-xt9wx" Apr 17 17:39:34.037919 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:34.037887 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f8ljs\" (UniqueName: \"kubernetes.io/projected/f80b8344-9646-4361-95a2-a54861e16606-kube-api-access-f8ljs\") pod \"f80b8344-9646-4361-95a2-a54861e16606\" (UID: \"f80b8344-9646-4361-95a2-a54861e16606\") " Apr 17 17:39:34.038102 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:34.037992 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f80b8344-9646-4361-95a2-a54861e16606-model-cache\") pod \"f80b8344-9646-4361-95a2-a54861e16606\" (UID: \"f80b8344-9646-4361-95a2-a54861e16606\") " Apr 17 17:39:34.038102 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:34.038072 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f80b8344-9646-4361-95a2-a54861e16606-dshm\") pod \"f80b8344-9646-4361-95a2-a54861e16606\" (UID: \"f80b8344-9646-4361-95a2-a54861e16606\") " Apr 17 17:39:34.038218 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:34.038115 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f80b8344-9646-4361-95a2-a54861e16606-tls-certs\") pod \"f80b8344-9646-4361-95a2-a54861e16606\" (UID: \"f80b8344-9646-4361-95a2-a54861e16606\") " Apr 17 17:39:34.038218 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:34.038143 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f80b8344-9646-4361-95a2-a54861e16606-kserve-provision-location\") pod \"f80b8344-9646-4361-95a2-a54861e16606\" (UID: \"f80b8344-9646-4361-95a2-a54861e16606\") " Apr 17 17:39:34.038218 ip-10-0-139-84 
kubenswrapper[2573]: I0417 17:39:34.038181 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f80b8344-9646-4361-95a2-a54861e16606-home\") pod \"f80b8344-9646-4361-95a2-a54861e16606\" (UID: \"f80b8344-9646-4361-95a2-a54861e16606\") " Apr 17 17:39:34.038355 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:34.038259 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f80b8344-9646-4361-95a2-a54861e16606-model-cache" (OuterVolumeSpecName: "model-cache") pod "f80b8344-9646-4361-95a2-a54861e16606" (UID: "f80b8344-9646-4361-95a2-a54861e16606"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:39:34.038538 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:34.038510 2573 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f80b8344-9646-4361-95a2-a54861e16606-model-cache\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\"" Apr 17 17:39:34.040660 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:34.040625 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f80b8344-9646-4361-95a2-a54861e16606-home" (OuterVolumeSpecName: "home") pod "f80b8344-9646-4361-95a2-a54861e16606" (UID: "f80b8344-9646-4361-95a2-a54861e16606"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:39:34.041111 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:34.041068 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f80b8344-9646-4361-95a2-a54861e16606-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "f80b8344-9646-4361-95a2-a54861e16606" (UID: "f80b8344-9646-4361-95a2-a54861e16606"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:39:34.041111 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:34.041090 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f80b8344-9646-4361-95a2-a54861e16606-kube-api-access-f8ljs" (OuterVolumeSpecName: "kube-api-access-f8ljs") pod "f80b8344-9646-4361-95a2-a54861e16606" (UID: "f80b8344-9646-4361-95a2-a54861e16606"). InnerVolumeSpecName "kube-api-access-f8ljs". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:39:34.041503 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:34.041480 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f80b8344-9646-4361-95a2-a54861e16606-dshm" (OuterVolumeSpecName: "dshm") pod "f80b8344-9646-4361-95a2-a54861e16606" (UID: "f80b8344-9646-4361-95a2-a54861e16606"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:39:34.042280 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:34.042263 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_conv-test-round-trip-kserve-68b88fd597-xt9wx_f80b8344-9646-4361-95a2-a54861e16606/main/0.log" Apr 17 17:39:34.042614 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:34.042590 2573 generic.go:358] "Generic (PLEG): container finished" podID="f80b8344-9646-4361-95a2-a54861e16606" containerID="14df65c0580960498c2b018d28d31ab1804450014e66c648e4ffdd3a83a0616c" exitCode=137 Apr 17 17:39:34.042896 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:34.042702 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-68b88fd597-xt9wx" Apr 17 17:39:34.042896 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:34.042735 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-68b88fd597-xt9wx" event={"ID":"f80b8344-9646-4361-95a2-a54861e16606","Type":"ContainerDied","Data":"14df65c0580960498c2b018d28d31ab1804450014e66c648e4ffdd3a83a0616c"} Apr 17 17:39:34.042896 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:34.042772 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-68b88fd597-xt9wx" event={"ID":"f80b8344-9646-4361-95a2-a54861e16606","Type":"ContainerDied","Data":"26ec1ff2065d4e0dccc8a415e7b238c89be244470724dc58512ca239033fec8c"} Apr 17 17:39:34.042896 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:34.042792 2573 scope.go:117] "RemoveContainer" containerID="14df65c0580960498c2b018d28d31ab1804450014e66c648e4ffdd3a83a0616c" Apr 17 17:39:34.056272 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:34.056246 2573 scope.go:117] "RemoveContainer" containerID="c85558facaeedfc6278dbcd1fc722d2e52a6404fe31376bcb8c635c176ffe8a1" Apr 17 17:39:34.137908 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:34.137884 2573 scope.go:117] "RemoveContainer" containerID="14df65c0580960498c2b018d28d31ab1804450014e66c648e4ffdd3a83a0616c" Apr 17 17:39:34.138321 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:39:34.138293 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14df65c0580960498c2b018d28d31ab1804450014e66c648e4ffdd3a83a0616c\": container with ID starting with 14df65c0580960498c2b018d28d31ab1804450014e66c648e4ffdd3a83a0616c not found: ID does not exist" containerID="14df65c0580960498c2b018d28d31ab1804450014e66c648e4ffdd3a83a0616c" Apr 17 17:39:34.138398 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:34.138332 2573 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14df65c0580960498c2b018d28d31ab1804450014e66c648e4ffdd3a83a0616c"} err="failed to get container status \"14df65c0580960498c2b018d28d31ab1804450014e66c648e4ffdd3a83a0616c\": rpc error: code = NotFound desc = could not find container \"14df65c0580960498c2b018d28d31ab1804450014e66c648e4ffdd3a83a0616c\": container with ID starting with 14df65c0580960498c2b018d28d31ab1804450014e66c648e4ffdd3a83a0616c not found: ID does not exist" Apr 17 17:39:34.138398 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:34.138363 2573 scope.go:117] "RemoveContainer" containerID="c85558facaeedfc6278dbcd1fc722d2e52a6404fe31376bcb8c635c176ffe8a1" Apr 17 17:39:34.138789 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:39:34.138728 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c85558facaeedfc6278dbcd1fc722d2e52a6404fe31376bcb8c635c176ffe8a1\": container with ID starting with c85558facaeedfc6278dbcd1fc722d2e52a6404fe31376bcb8c635c176ffe8a1 not found: ID does not exist" containerID="c85558facaeedfc6278dbcd1fc722d2e52a6404fe31376bcb8c635c176ffe8a1" Apr 17 17:39:34.138886 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:34.138779 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c85558facaeedfc6278dbcd1fc722d2e52a6404fe31376bcb8c635c176ffe8a1"} err="failed to get container status \"c85558facaeedfc6278dbcd1fc722d2e52a6404fe31376bcb8c635c176ffe8a1\": rpc error: code = NotFound desc = could not find container \"c85558facaeedfc6278dbcd1fc722d2e52a6404fe31376bcb8c635c176ffe8a1\": container with ID starting with c85558facaeedfc6278dbcd1fc722d2e52a6404fe31376bcb8c635c176ffe8a1 not found: ID does not exist" Apr 17 17:39:34.139073 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:34.139047 2573 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: 
\"kubernetes.io/empty-dir/f80b8344-9646-4361-95a2-a54861e16606-home\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\"" Apr 17 17:39:34.139073 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:34.139074 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-f8ljs\" (UniqueName: \"kubernetes.io/projected/f80b8344-9646-4361-95a2-a54861e16606-kube-api-access-f8ljs\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\"" Apr 17 17:39:34.139245 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:34.139090 2573 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f80b8344-9646-4361-95a2-a54861e16606-dshm\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\"" Apr 17 17:39:34.139245 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:34.139104 2573 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f80b8344-9646-4361-95a2-a54861e16606-tls-certs\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\"" Apr 17 17:39:34.139610 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:34.139577 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f80b8344-9646-4361-95a2-a54861e16606-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f80b8344-9646-4361-95a2-a54861e16606" (UID: "f80b8344-9646-4361-95a2-a54861e16606"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:39:34.240156 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:34.240066 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f80b8344-9646-4361-95a2-a54861e16606-kserve-provision-location\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\"" Apr 17 17:39:34.700859 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:34.700824 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/conv-test-round-trip-kserve-68b88fd597-xt9wx"] Apr 17 17:39:34.708600 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:34.708560 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/conv-test-round-trip-kserve-68b88fd597-xt9wx"] Apr 17 17:39:35.198720 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:35.198681 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f80b8344-9646-4361-95a2-a54861e16606" path="/var/lib/kubelet/pods/f80b8344-9646-4361-95a2-a54861e16606/volumes" Apr 17 17:39:37.062954 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:37.062866 2573 generic.go:358] "Generic (PLEG): container finished" podID="d31a00e2-ce2a-4b4e-ab86-26eba01505bf" containerID="6cbe5de6150889d81e1906c4aef0440c95bcbb909088b19454b7e8fecd4866d5" exitCode=0 Apr 17 17:39:37.063308 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:37.062943 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-79c5b74fb-zwv6t" event={"ID":"d31a00e2-ce2a-4b4e-ab86-26eba01505bf","Type":"ContainerDied","Data":"6cbe5de6150889d81e1906c4aef0440c95bcbb909088b19454b7e8fecd4866d5"} Apr 17 17:39:38.068220 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:38.068184 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-79c5b74fb-zwv6t" 
event={"ID":"d31a00e2-ce2a-4b4e-ab86-26eba01505bf","Type":"ContainerStarted","Data":"1167711d3f8b4faea3d3d35f821c1f38d07e253bd15bdecf7d547dcb7a390ba9"} Apr 17 17:39:38.088502 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:38.088447 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-79c5b74fb-zwv6t" podStartSLOduration=6.08843241 podStartE2EDuration="6.08843241s" podCreationTimestamp="2026-04-17 17:39:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:39:38.087774367 +0000 UTC m=+923.403856210" watchObservedRunningTime="2026-04-17 17:39:38.08843241 +0000 UTC m=+923.404514253" Apr 17 17:39:41.050599 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:41.050550 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-769d6cb8-m8jc5" podUID="33abd26c-a518-4b23-949b-c58b1d006c59" containerName="main" probeResult="failure" output="Get \"https://10.133.0.25:8000/health\": dial tcp 10.133.0.25:8000: connect: connection refused" Apr 17 17:39:42.640098 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:42.640055 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-79c5b74fb-zwv6t" Apr 17 17:39:42.640098 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:42.640107 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-79c5b74fb-zwv6t" Apr 17 17:39:42.641909 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:42.641878 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-79c5b74fb-zwv6t" podUID="d31a00e2-ce2a-4b4e-ab86-26eba01505bf" containerName="main" probeResult="failure" output="Get \"https://10.133.0.26:8000/health\": dial tcp 10.133.0.26:8000: 
connect: connection refused" Apr 17 17:39:51.050465 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:51.050415 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-769d6cb8-m8jc5" podUID="33abd26c-a518-4b23-949b-c58b1d006c59" containerName="main" probeResult="failure" output="Get \"https://10.133.0.25:8000/health\": dial tcp 10.133.0.25:8000: connect: connection refused" Apr 17 17:39:52.640909 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:52.640859 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-79c5b74fb-zwv6t" podUID="d31a00e2-ce2a-4b4e-ab86-26eba01505bf" containerName="main" probeResult="failure" output="Get \"https://10.133.0.26:8000/health\": dial tcp 10.133.0.26:8000: connect: connection refused" Apr 17 17:39:56.928211 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:56.928147 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-779b5987b68r64h_aa50f95e-ff8f-469c-bcbf-5b907b155a1e/main/0.log" Apr 17 17:39:56.928556 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:56.928530 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-779b5987b68r64h" Apr 17 17:39:57.044218 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:57.043180 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/aa50f95e-ff8f-469c-bcbf-5b907b155a1e-tls-certs\") pod \"aa50f95e-ff8f-469c-bcbf-5b907b155a1e\" (UID: \"aa50f95e-ff8f-469c-bcbf-5b907b155a1e\") " Apr 17 17:39:57.044218 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:57.043293 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/aa50f95e-ff8f-469c-bcbf-5b907b155a1e-home\") pod \"aa50f95e-ff8f-469c-bcbf-5b907b155a1e\" (UID: \"aa50f95e-ff8f-469c-bcbf-5b907b155a1e\") " Apr 17 17:39:57.044218 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:57.043328 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/aa50f95e-ff8f-469c-bcbf-5b907b155a1e-kserve-provision-location\") pod \"aa50f95e-ff8f-469c-bcbf-5b907b155a1e\" (UID: \"aa50f95e-ff8f-469c-bcbf-5b907b155a1e\") " Apr 17 17:39:57.044218 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:57.043365 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/aa50f95e-ff8f-469c-bcbf-5b907b155a1e-model-cache\") pod \"aa50f95e-ff8f-469c-bcbf-5b907b155a1e\" (UID: \"aa50f95e-ff8f-469c-bcbf-5b907b155a1e\") " Apr 17 17:39:57.044218 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:57.043435 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q4srr\" (UniqueName: \"kubernetes.io/projected/aa50f95e-ff8f-469c-bcbf-5b907b155a1e-kube-api-access-q4srr\") pod \"aa50f95e-ff8f-469c-bcbf-5b907b155a1e\" (UID: \"aa50f95e-ff8f-469c-bcbf-5b907b155a1e\") " Apr 17 17:39:57.044218 
ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:57.043494 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/aa50f95e-ff8f-469c-bcbf-5b907b155a1e-dshm\") pod \"aa50f95e-ff8f-469c-bcbf-5b907b155a1e\" (UID: \"aa50f95e-ff8f-469c-bcbf-5b907b155a1e\") " Apr 17 17:39:57.049799 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:57.049742 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa50f95e-ff8f-469c-bcbf-5b907b155a1e-model-cache" (OuterVolumeSpecName: "model-cache") pod "aa50f95e-ff8f-469c-bcbf-5b907b155a1e" (UID: "aa50f95e-ff8f-469c-bcbf-5b907b155a1e"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:39:57.050044 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:57.049990 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa50f95e-ff8f-469c-bcbf-5b907b155a1e-home" (OuterVolumeSpecName: "home") pod "aa50f95e-ff8f-469c-bcbf-5b907b155a1e" (UID: "aa50f95e-ff8f-469c-bcbf-5b907b155a1e"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:39:57.051912 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:57.051878 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa50f95e-ff8f-469c-bcbf-5b907b155a1e-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "aa50f95e-ff8f-469c-bcbf-5b907b155a1e" (UID: "aa50f95e-ff8f-469c-bcbf-5b907b155a1e"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:39:57.052807 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:57.052765 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa50f95e-ff8f-469c-bcbf-5b907b155a1e-kube-api-access-q4srr" (OuterVolumeSpecName: "kube-api-access-q4srr") pod "aa50f95e-ff8f-469c-bcbf-5b907b155a1e" (UID: "aa50f95e-ff8f-469c-bcbf-5b907b155a1e"). InnerVolumeSpecName "kube-api-access-q4srr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:39:57.052914 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:57.052855 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa50f95e-ff8f-469c-bcbf-5b907b155a1e-dshm" (OuterVolumeSpecName: "dshm") pod "aa50f95e-ff8f-469c-bcbf-5b907b155a1e" (UID: "aa50f95e-ff8f-469c-bcbf-5b907b155a1e"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:39:57.117742 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:57.117688 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa50f95e-ff8f-469c-bcbf-5b907b155a1e-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "aa50f95e-ff8f-469c-bcbf-5b907b155a1e" (UID: "aa50f95e-ff8f-469c-bcbf-5b907b155a1e"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:39:57.138079 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:57.138048 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-779b5987b68r64h_aa50f95e-ff8f-469c-bcbf-5b907b155a1e/main/0.log" Apr 17 17:39:57.138448 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:57.138424 2573 generic.go:358] "Generic (PLEG): container finished" podID="aa50f95e-ff8f-469c-bcbf-5b907b155a1e" containerID="f04979a773ef75844b51e649e299905d9e1e9ee7ad4032e371803287aa3e340f" exitCode=137 Apr 17 17:39:57.138557 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:57.138520 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-779b5987b68r64h" event={"ID":"aa50f95e-ff8f-469c-bcbf-5b907b155a1e","Type":"ContainerDied","Data":"f04979a773ef75844b51e649e299905d9e1e9ee7ad4032e371803287aa3e340f"} Apr 17 17:39:57.138557 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:57.138543 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-779b5987b68r64h" Apr 17 17:39:57.138639 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:57.138573 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-779b5987b68r64h" event={"ID":"aa50f95e-ff8f-469c-bcbf-5b907b155a1e","Type":"ContainerDied","Data":"d88715e0fc1e9ef9b2ea07b4fcb49016653e4b52d0d62d8bb28123cd1498f6e9"} Apr 17 17:39:57.138639 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:57.138620 2573 scope.go:117] "RemoveContainer" containerID="f04979a773ef75844b51e649e299905d9e1e9ee7ad4032e371803287aa3e340f" Apr 17 17:39:57.144559 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:57.144530 2573 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/aa50f95e-ff8f-469c-bcbf-5b907b155a1e-dshm\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\"" Apr 17 17:39:57.144636 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:57.144561 2573 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/aa50f95e-ff8f-469c-bcbf-5b907b155a1e-tls-certs\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\"" Apr 17 17:39:57.144636 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:57.144570 2573 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/aa50f95e-ff8f-469c-bcbf-5b907b155a1e-home\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\"" Apr 17 17:39:57.144636 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:57.144580 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/aa50f95e-ff8f-469c-bcbf-5b907b155a1e-kserve-provision-location\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\"" Apr 17 17:39:57.144636 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:57.144589 2573 
reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/aa50f95e-ff8f-469c-bcbf-5b907b155a1e-model-cache\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\"" Apr 17 17:39:57.144636 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:57.144599 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-q4srr\" (UniqueName: \"kubernetes.io/projected/aa50f95e-ff8f-469c-bcbf-5b907b155a1e-kube-api-access-q4srr\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\"" Apr 17 17:39:57.164873 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:57.164847 2573 scope.go:117] "RemoveContainer" containerID="3657018e690dd771bcb64917588c50a20ac4980c4e18d3c1d2ba508def02b14c" Apr 17 17:39:57.165238 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:57.165205 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-779b5987b68r64h"] Apr 17 17:39:57.168782 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:57.168758 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-779b5987b68r64h"] Apr 17 17:39:57.197131 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:57.197058 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa50f95e-ff8f-469c-bcbf-5b907b155a1e" path="/var/lib/kubelet/pods/aa50f95e-ff8f-469c-bcbf-5b907b155a1e/volumes" Apr 17 17:39:57.226182 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:57.226157 2573 scope.go:117] "RemoveContainer" containerID="f04979a773ef75844b51e649e299905d9e1e9ee7ad4032e371803287aa3e340f" Apr 17 17:39:57.226497 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:39:57.226473 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f04979a773ef75844b51e649e299905d9e1e9ee7ad4032e371803287aa3e340f\": container with ID starting with 
f04979a773ef75844b51e649e299905d9e1e9ee7ad4032e371803287aa3e340f not found: ID does not exist" containerID="f04979a773ef75844b51e649e299905d9e1e9ee7ad4032e371803287aa3e340f" Apr 17 17:39:57.226559 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:57.226509 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f04979a773ef75844b51e649e299905d9e1e9ee7ad4032e371803287aa3e340f"} err="failed to get container status \"f04979a773ef75844b51e649e299905d9e1e9ee7ad4032e371803287aa3e340f\": rpc error: code = NotFound desc = could not find container \"f04979a773ef75844b51e649e299905d9e1e9ee7ad4032e371803287aa3e340f\": container with ID starting with f04979a773ef75844b51e649e299905d9e1e9ee7ad4032e371803287aa3e340f not found: ID does not exist" Apr 17 17:39:57.226559 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:57.226536 2573 scope.go:117] "RemoveContainer" containerID="3657018e690dd771bcb64917588c50a20ac4980c4e18d3c1d2ba508def02b14c" Apr 17 17:39:57.226822 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:39:57.226799 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3657018e690dd771bcb64917588c50a20ac4980c4e18d3c1d2ba508def02b14c\": container with ID starting with 3657018e690dd771bcb64917588c50a20ac4980c4e18d3c1d2ba508def02b14c not found: ID does not exist" containerID="3657018e690dd771bcb64917588c50a20ac4980c4e18d3c1d2ba508def02b14c" Apr 17 17:39:57.226928 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:39:57.226826 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3657018e690dd771bcb64917588c50a20ac4980c4e18d3c1d2ba508def02b14c"} err="failed to get container status \"3657018e690dd771bcb64917588c50a20ac4980c4e18d3c1d2ba508def02b14c\": rpc error: code = NotFound desc = could not find container \"3657018e690dd771bcb64917588c50a20ac4980c4e18d3c1d2ba508def02b14c\": container with ID starting with 
3657018e690dd771bcb64917588c50a20ac4980c4e18d3c1d2ba508def02b14c not found: ID does not exist" Apr 17 17:40:01.050717 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:40:01.050672 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-769d6cb8-m8jc5" podUID="33abd26c-a518-4b23-949b-c58b1d006c59" containerName="main" probeResult="failure" output="Get \"https://10.133.0.25:8000/health\": dial tcp 10.133.0.25:8000: connect: connection refused" Apr 17 17:40:02.641040 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:40:02.640979 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-79c5b74fb-zwv6t" podUID="d31a00e2-ce2a-4b4e-ab86-26eba01505bf" containerName="main" probeResult="failure" output="Get \"https://10.133.0.26:8000/health\": dial tcp 10.133.0.26:8000: connect: connection refused" Apr 17 17:40:11.050899 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:40:11.050844 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-769d6cb8-m8jc5" podUID="33abd26c-a518-4b23-949b-c58b1d006c59" containerName="main" probeResult="failure" output="Get \"https://10.133.0.25:8000/health\": dial tcp 10.133.0.25:8000: connect: connection refused" Apr 17 17:40:12.640429 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:40:12.640384 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-79c5b74fb-zwv6t" podUID="d31a00e2-ce2a-4b4e-ab86-26eba01505bf" containerName="main" probeResult="failure" output="Get \"https://10.133.0.26:8000/health\": dial tcp 10.133.0.26:8000: connect: connection refused" Apr 17 17:40:21.050602 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:40:21.050548 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-769d6cb8-m8jc5" podUID="33abd26c-a518-4b23-949b-c58b1d006c59" containerName="main" probeResult="failure" output="Get 
\"https://10.133.0.25:8000/health\": dial tcp 10.133.0.25:8000: connect: connection refused" Apr 17 17:40:22.640583 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:40:22.640531 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-79c5b74fb-zwv6t" podUID="d31a00e2-ce2a-4b4e-ab86-26eba01505bf" containerName="main" probeResult="failure" output="Get \"https://10.133.0.26:8000/health\": dial tcp 10.133.0.26:8000: connect: connection refused" Apr 17 17:40:31.049915 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:40:31.049857 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-769d6cb8-m8jc5" podUID="33abd26c-a518-4b23-949b-c58b1d006c59" containerName="main" probeResult="failure" output="Get \"https://10.133.0.25:8000/health\": dial tcp 10.133.0.25:8000: connect: connection refused" Apr 17 17:40:32.641175 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:40:32.641129 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-79c5b74fb-zwv6t" podUID="d31a00e2-ce2a-4b4e-ab86-26eba01505bf" containerName="main" probeResult="failure" output="Get \"https://10.133.0.26:8000/health\": dial tcp 10.133.0.26:8000: connect: connection refused" Apr 17 17:40:41.050371 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:40:41.050329 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-769d6cb8-m8jc5" podUID="33abd26c-a518-4b23-949b-c58b1d006c59" containerName="main" probeResult="failure" output="Get \"https://10.133.0.25:8000/health\": dial tcp 10.133.0.25:8000: connect: connection refused" Apr 17 17:40:42.640267 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:40:42.640214 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-79c5b74fb-zwv6t" podUID="d31a00e2-ce2a-4b4e-ab86-26eba01505bf" containerName="main" probeResult="failure" 
output="Get \"https://10.133.0.26:8000/health\": dial tcp 10.133.0.26:8000: connect: connection refused" Apr 17 17:40:51.059999 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:40:51.059962 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/stop-feature-test-kserve-769d6cb8-m8jc5" Apr 17 17:40:51.067648 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:40:51.067622 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-769d6cb8-m8jc5" Apr 17 17:40:52.202541 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:40:52.202505 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-769d6cb8-m8jc5"] Apr 17 17:40:52.324559 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:40:52.324493 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-769d6cb8-m8jc5" podUID="33abd26c-a518-4b23-949b-c58b1d006c59" containerName="main" containerID="cri-o://8441d18ac5493675ae71f8493216746fbf307330281cec775bb5d7d6a919a2cf" gracePeriod=30 Apr 17 17:40:52.641002 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:40:52.640968 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-79c5b74fb-zwv6t" podUID="d31a00e2-ce2a-4b4e-ab86-26eba01505bf" containerName="main" probeResult="failure" output="Get \"https://10.133.0.26:8000/health\": dial tcp 10.133.0.26:8000: connect: connection refused" Apr 17 17:41:02.640870 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:02.640817 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-79c5b74fb-zwv6t" podUID="d31a00e2-ce2a-4b4e-ab86-26eba01505bf" containerName="main" probeResult="failure" output="Get \"https://10.133.0.26:8000/health\": dial tcp 10.133.0.26:8000: connect: connection refused" Apr 17 17:41:06.421068 ip-10-0-139-84 
kubenswrapper[2573]: I0417 17:41:06.421032 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-769d6cb8-fm9k2"] Apr 17 17:41:06.421536 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:06.421506 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aa50f95e-ff8f-469c-bcbf-5b907b155a1e" containerName="main" Apr 17 17:41:06.421536 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:06.421526 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa50f95e-ff8f-469c-bcbf-5b907b155a1e" containerName="main" Apr 17 17:41:06.421680 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:06.421544 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aa50f95e-ff8f-469c-bcbf-5b907b155a1e" containerName="storage-initializer" Apr 17 17:41:06.421680 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:06.421554 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa50f95e-ff8f-469c-bcbf-5b907b155a1e" containerName="storage-initializer" Apr 17 17:41:06.421680 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:06.421568 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f80b8344-9646-4361-95a2-a54861e16606" containerName="storage-initializer" Apr 17 17:41:06.421680 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:06.421577 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="f80b8344-9646-4361-95a2-a54861e16606" containerName="storage-initializer" Apr 17 17:41:06.421680 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:06.421585 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f80b8344-9646-4361-95a2-a54861e16606" containerName="main" Apr 17 17:41:06.421680 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:06.421593 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="f80b8344-9646-4361-95a2-a54861e16606" containerName="main" Apr 17 17:41:06.421680 ip-10-0-139-84 kubenswrapper[2573]: I0417 
17:41:06.421656 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="f80b8344-9646-4361-95a2-a54861e16606" containerName="main" Apr 17 17:41:06.421680 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:06.421667 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="aa50f95e-ff8f-469c-bcbf-5b907b155a1e" containerName="main" Apr 17 17:41:06.426065 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:06.426042 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-769d6cb8-fm9k2" Apr 17 17:41:06.434110 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:06.434083 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-769d6cb8-fm9k2"] Apr 17 17:41:06.527066 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:06.527001 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0313892d-db8d-4f99-8f65-1f01c14256bd-tls-certs\") pod \"stop-feature-test-kserve-769d6cb8-fm9k2\" (UID: \"0313892d-db8d-4f99-8f65-1f01c14256bd\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-769d6cb8-fm9k2" Apr 17 17:41:06.527261 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:06.527104 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/0313892d-db8d-4f99-8f65-1f01c14256bd-model-cache\") pod \"stop-feature-test-kserve-769d6cb8-fm9k2\" (UID: \"0313892d-db8d-4f99-8f65-1f01c14256bd\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-769d6cb8-fm9k2" Apr 17 17:41:06.527261 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:06.527155 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0313892d-db8d-4f99-8f65-1f01c14256bd-kserve-provision-location\") pod 
\"stop-feature-test-kserve-769d6cb8-fm9k2\" (UID: \"0313892d-db8d-4f99-8f65-1f01c14256bd\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-769d6cb8-fm9k2" Apr 17 17:41:06.527261 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:06.527180 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/0313892d-db8d-4f99-8f65-1f01c14256bd-home\") pod \"stop-feature-test-kserve-769d6cb8-fm9k2\" (UID: \"0313892d-db8d-4f99-8f65-1f01c14256bd\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-769d6cb8-fm9k2" Apr 17 17:41:06.527414 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:06.527264 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cf44f\" (UniqueName: \"kubernetes.io/projected/0313892d-db8d-4f99-8f65-1f01c14256bd-kube-api-access-cf44f\") pod \"stop-feature-test-kserve-769d6cb8-fm9k2\" (UID: \"0313892d-db8d-4f99-8f65-1f01c14256bd\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-769d6cb8-fm9k2" Apr 17 17:41:06.527414 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:06.527287 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/0313892d-db8d-4f99-8f65-1f01c14256bd-dshm\") pod \"stop-feature-test-kserve-769d6cb8-fm9k2\" (UID: \"0313892d-db8d-4f99-8f65-1f01c14256bd\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-769d6cb8-fm9k2" Apr 17 17:41:06.627730 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:06.627687 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0313892d-db8d-4f99-8f65-1f01c14256bd-tls-certs\") pod \"stop-feature-test-kserve-769d6cb8-fm9k2\" (UID: \"0313892d-db8d-4f99-8f65-1f01c14256bd\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-769d6cb8-fm9k2" Apr 17 17:41:06.627730 ip-10-0-139-84 kubenswrapper[2573]: I0417 
17:41:06.627739 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/0313892d-db8d-4f99-8f65-1f01c14256bd-model-cache\") pod \"stop-feature-test-kserve-769d6cb8-fm9k2\" (UID: \"0313892d-db8d-4f99-8f65-1f01c14256bd\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-769d6cb8-fm9k2" Apr 17 17:41:06.627974 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:06.627767 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0313892d-db8d-4f99-8f65-1f01c14256bd-kserve-provision-location\") pod \"stop-feature-test-kserve-769d6cb8-fm9k2\" (UID: \"0313892d-db8d-4f99-8f65-1f01c14256bd\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-769d6cb8-fm9k2" Apr 17 17:41:06.627974 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:06.627797 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/0313892d-db8d-4f99-8f65-1f01c14256bd-home\") pod \"stop-feature-test-kserve-769d6cb8-fm9k2\" (UID: \"0313892d-db8d-4f99-8f65-1f01c14256bd\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-769d6cb8-fm9k2" Apr 17 17:41:06.627974 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:06.627860 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cf44f\" (UniqueName: \"kubernetes.io/projected/0313892d-db8d-4f99-8f65-1f01c14256bd-kube-api-access-cf44f\") pod \"stop-feature-test-kserve-769d6cb8-fm9k2\" (UID: \"0313892d-db8d-4f99-8f65-1f01c14256bd\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-769d6cb8-fm9k2" Apr 17 17:41:06.627974 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:06.627904 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/0313892d-db8d-4f99-8f65-1f01c14256bd-dshm\") pod \"stop-feature-test-kserve-769d6cb8-fm9k2\" 
(UID: \"0313892d-db8d-4f99-8f65-1f01c14256bd\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-769d6cb8-fm9k2" Apr 17 17:41:06.628284 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:06.628209 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/0313892d-db8d-4f99-8f65-1f01c14256bd-home\") pod \"stop-feature-test-kserve-769d6cb8-fm9k2\" (UID: \"0313892d-db8d-4f99-8f65-1f01c14256bd\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-769d6cb8-fm9k2" Apr 17 17:41:06.628284 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:06.628251 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0313892d-db8d-4f99-8f65-1f01c14256bd-kserve-provision-location\") pod \"stop-feature-test-kserve-769d6cb8-fm9k2\" (UID: \"0313892d-db8d-4f99-8f65-1f01c14256bd\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-769d6cb8-fm9k2" Apr 17 17:41:06.628391 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:06.628361 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/0313892d-db8d-4f99-8f65-1f01c14256bd-model-cache\") pod \"stop-feature-test-kserve-769d6cb8-fm9k2\" (UID: \"0313892d-db8d-4f99-8f65-1f01c14256bd\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-769d6cb8-fm9k2" Apr 17 17:41:06.630075 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:06.630051 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/0313892d-db8d-4f99-8f65-1f01c14256bd-dshm\") pod \"stop-feature-test-kserve-769d6cb8-fm9k2\" (UID: \"0313892d-db8d-4f99-8f65-1f01c14256bd\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-769d6cb8-fm9k2" Apr 17 17:41:06.630248 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:06.630230 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/0313892d-db8d-4f99-8f65-1f01c14256bd-tls-certs\") pod \"stop-feature-test-kserve-769d6cb8-fm9k2\" (UID: \"0313892d-db8d-4f99-8f65-1f01c14256bd\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-769d6cb8-fm9k2" Apr 17 17:41:06.636676 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:06.636654 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cf44f\" (UniqueName: \"kubernetes.io/projected/0313892d-db8d-4f99-8f65-1f01c14256bd-kube-api-access-cf44f\") pod \"stop-feature-test-kserve-769d6cb8-fm9k2\" (UID: \"0313892d-db8d-4f99-8f65-1f01c14256bd\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-769d6cb8-fm9k2" Apr 17 17:41:06.737716 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:06.737616 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-769d6cb8-fm9k2" Apr 17 17:41:06.867248 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:06.867217 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-769d6cb8-fm9k2"] Apr 17 17:41:06.870096 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:41:06.870064 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0313892d_db8d_4f99_8f65_1f01c14256bd.slice/crio-c787fe46a1b6c60e3c06c931d77634dc9e7c590eebc16b3c32cfa1cc984b7024 WatchSource:0}: Error finding container c787fe46a1b6c60e3c06c931d77634dc9e7c590eebc16b3c32cfa1cc984b7024: Status 404 returned error can't find the container with id c787fe46a1b6c60e3c06c931d77634dc9e7c590eebc16b3c32cfa1cc984b7024 Apr 17 17:41:06.872141 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:06.872120 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 17:41:07.373946 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:07.373907 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/stop-feature-test-kserve-769d6cb8-fm9k2" event={"ID":"0313892d-db8d-4f99-8f65-1f01c14256bd","Type":"ContainerStarted","Data":"71525a1af425a52245f7732ba7e16f99bf6cdcb5b5dbb1b6d5d3a460b6a8939d"} Apr 17 17:41:07.373946 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:07.373944 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-769d6cb8-fm9k2" event={"ID":"0313892d-db8d-4f99-8f65-1f01c14256bd","Type":"ContainerStarted","Data":"c787fe46a1b6c60e3c06c931d77634dc9e7c590eebc16b3c32cfa1cc984b7024"} Apr 17 17:41:11.390611 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:11.390575 2573 generic.go:358] "Generic (PLEG): container finished" podID="0313892d-db8d-4f99-8f65-1f01c14256bd" containerID="71525a1af425a52245f7732ba7e16f99bf6cdcb5b5dbb1b6d5d3a460b6a8939d" exitCode=0 Apr 17 17:41:11.391121 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:11.390655 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-769d6cb8-fm9k2" event={"ID":"0313892d-db8d-4f99-8f65-1f01c14256bd","Type":"ContainerDied","Data":"71525a1af425a52245f7732ba7e16f99bf6cdcb5b5dbb1b6d5d3a460b6a8939d"} Apr 17 17:41:12.397379 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:12.397343 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-769d6cb8-fm9k2" event={"ID":"0313892d-db8d-4f99-8f65-1f01c14256bd","Type":"ContainerStarted","Data":"578d5b803fbb883de166b2cf178599584b08ec3d2e760389e27fd79ccd783ea2"} Apr 17 17:41:12.417308 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:12.417268 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/stop-feature-test-kserve-769d6cb8-fm9k2" podStartSLOduration=6.417255131 podStartE2EDuration="6.417255131s" podCreationTimestamp="2026-04-17 17:41:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-04-17 17:41:12.416067105 +0000 UTC m=+1017.732148947" watchObservedRunningTime="2026-04-17 17:41:12.417255131 +0000 UTC m=+1017.733336972" Apr 17 17:41:12.641238 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:12.641198 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-79c5b74fb-zwv6t" podUID="d31a00e2-ce2a-4b4e-ab86-26eba01505bf" containerName="main" probeResult="failure" output="Get \"https://10.133.0.26:8000/health\": dial tcp 10.133.0.26:8000: connect: connection refused" Apr 17 17:41:16.738750 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:16.738711 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-769d6cb8-fm9k2" Apr 17 17:41:16.739268 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:16.738861 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/stop-feature-test-kserve-769d6cb8-fm9k2" Apr 17 17:41:16.740469 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:16.740441 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-769d6cb8-fm9k2" podUID="0313892d-db8d-4f99-8f65-1f01c14256bd" containerName="main" probeResult="failure" output="Get \"https://10.133.0.27:8000/health\": dial tcp 10.133.0.27:8000: connect: connection refused" Apr 17 17:41:22.429858 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:22.429830 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-769d6cb8-m8jc5_33abd26c-a518-4b23-949b-c58b1d006c59/main/0.log" Apr 17 17:41:22.430230 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:22.430205 2573 generic.go:358] "Generic (PLEG): container finished" podID="33abd26c-a518-4b23-949b-c58b1d006c59" containerID="8441d18ac5493675ae71f8493216746fbf307330281cec775bb5d7d6a919a2cf" exitCode=137 Apr 17 17:41:22.430279 ip-10-0-139-84 kubenswrapper[2573]: 
I0417 17:41:22.430267 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-769d6cb8-m8jc5" event={"ID":"33abd26c-a518-4b23-949b-c58b1d006c59","Type":"ContainerDied","Data":"8441d18ac5493675ae71f8493216746fbf307330281cec775bb5d7d6a919a2cf"} Apr 17 17:41:22.589891 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:22.589871 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-769d6cb8-m8jc5_33abd26c-a518-4b23-949b-c58b1d006c59/main/0.log" Apr 17 17:41:22.590237 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:22.590214 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-769d6cb8-m8jc5" Apr 17 17:41:22.650609 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:22.650582 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-79c5b74fb-zwv6t" Apr 17 17:41:22.658301 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:22.658273 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-79c5b74fb-zwv6t" Apr 17 17:41:22.664655 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:22.664635 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9t64q\" (UniqueName: \"kubernetes.io/projected/33abd26c-a518-4b23-949b-c58b1d006c59-kube-api-access-9t64q\") pod \"33abd26c-a518-4b23-949b-c58b1d006c59\" (UID: \"33abd26c-a518-4b23-949b-c58b1d006c59\") " Apr 17 17:41:22.664746 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:22.664669 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/33abd26c-a518-4b23-949b-c58b1d006c59-dshm\") pod \"33abd26c-a518-4b23-949b-c58b1d006c59\" (UID: \"33abd26c-a518-4b23-949b-c58b1d006c59\") " Apr 17 17:41:22.664746 
ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:22.664739 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/33abd26c-a518-4b23-949b-c58b1d006c59-tls-certs\") pod \"33abd26c-a518-4b23-949b-c58b1d006c59\" (UID: \"33abd26c-a518-4b23-949b-c58b1d006c59\") " Apr 17 17:41:22.664853 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:22.664774 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/33abd26c-a518-4b23-949b-c58b1d006c59-kserve-provision-location\") pod \"33abd26c-a518-4b23-949b-c58b1d006c59\" (UID: \"33abd26c-a518-4b23-949b-c58b1d006c59\") " Apr 17 17:41:22.664853 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:22.664812 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/33abd26c-a518-4b23-949b-c58b1d006c59-home\") pod \"33abd26c-a518-4b23-949b-c58b1d006c59\" (UID: \"33abd26c-a518-4b23-949b-c58b1d006c59\") " Apr 17 17:41:22.664853 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:22.664840 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/33abd26c-a518-4b23-949b-c58b1d006c59-model-cache\") pod \"33abd26c-a518-4b23-949b-c58b1d006c59\" (UID: \"33abd26c-a518-4b23-949b-c58b1d006c59\") " Apr 17 17:41:22.665553 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:22.665210 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33abd26c-a518-4b23-949b-c58b1d006c59-home" (OuterVolumeSpecName: "home") pod "33abd26c-a518-4b23-949b-c58b1d006c59" (UID: "33abd26c-a518-4b23-949b-c58b1d006c59"). InnerVolumeSpecName "home". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:41:22.665553 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:22.665534 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33abd26c-a518-4b23-949b-c58b1d006c59-model-cache" (OuterVolumeSpecName: "model-cache") pod "33abd26c-a518-4b23-949b-c58b1d006c59" (UID: "33abd26c-a518-4b23-949b-c58b1d006c59"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:41:22.667308 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:22.667269 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33abd26c-a518-4b23-949b-c58b1d006c59-dshm" (OuterVolumeSpecName: "dshm") pod "33abd26c-a518-4b23-949b-c58b1d006c59" (UID: "33abd26c-a518-4b23-949b-c58b1d006c59"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:41:22.667400 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:22.667332 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33abd26c-a518-4b23-949b-c58b1d006c59-kube-api-access-9t64q" (OuterVolumeSpecName: "kube-api-access-9t64q") pod "33abd26c-a518-4b23-949b-c58b1d006c59" (UID: "33abd26c-a518-4b23-949b-c58b1d006c59"). InnerVolumeSpecName "kube-api-access-9t64q". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:41:22.667400 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:22.667332 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33abd26c-a518-4b23-949b-c58b1d006c59-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "33abd26c-a518-4b23-949b-c58b1d006c59" (UID: "33abd26c-a518-4b23-949b-c58b1d006c59"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:41:22.722229 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:22.722195 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33abd26c-a518-4b23-949b-c58b1d006c59-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "33abd26c-a518-4b23-949b-c58b1d006c59" (UID: "33abd26c-a518-4b23-949b-c58b1d006c59"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:41:22.765725 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:22.765697 2573 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/33abd26c-a518-4b23-949b-c58b1d006c59-tls-certs\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\"" Apr 17 17:41:22.765725 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:22.765725 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/33abd26c-a518-4b23-949b-c58b1d006c59-kserve-provision-location\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\"" Apr 17 17:41:22.765906 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:22.765735 2573 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/33abd26c-a518-4b23-949b-c58b1d006c59-home\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\"" Apr 17 17:41:22.765906 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:22.765744 2573 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/33abd26c-a518-4b23-949b-c58b1d006c59-model-cache\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\"" Apr 17 17:41:22.765906 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:22.765753 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9t64q\" (UniqueName: 
\"kubernetes.io/projected/33abd26c-a518-4b23-949b-c58b1d006c59-kube-api-access-9t64q\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\"" Apr 17 17:41:22.765906 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:22.765761 2573 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/33abd26c-a518-4b23-949b-c58b1d006c59-dshm\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\"" Apr 17 17:41:23.434373 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:23.434346 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-769d6cb8-m8jc5_33abd26c-a518-4b23-949b-c58b1d006c59/main/0.log" Apr 17 17:41:23.434808 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:23.434792 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-769d6cb8-m8jc5" Apr 17 17:41:23.434863 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:23.434786 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-769d6cb8-m8jc5" event={"ID":"33abd26c-a518-4b23-949b-c58b1d006c59","Type":"ContainerDied","Data":"1a446d88a39bc53c3f1495a9dd31afcac05435496bc1fa2c91d57d90513b825e"} Apr 17 17:41:23.434863 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:23.434842 2573 scope.go:117] "RemoveContainer" containerID="8441d18ac5493675ae71f8493216746fbf307330281cec775bb5d7d6a919a2cf" Apr 17 17:41:23.452932 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:23.452847 2573 scope.go:117] "RemoveContainer" containerID="c600bb1228670832dca8018d549887502bd65dc3c615f26e4b87abc3d5d29624" Apr 17 17:41:23.456065 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:23.456042 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-769d6cb8-m8jc5"] Apr 17 17:41:23.460690 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:23.460667 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["kserve-ci-e2e-test/stop-feature-test-kserve-769d6cb8-m8jc5"] Apr 17 17:41:25.198105 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:25.198064 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33abd26c-a518-4b23-949b-c58b1d006c59" path="/var/lib/kubelet/pods/33abd26c-a518-4b23-949b-c58b1d006c59/volumes" Apr 17 17:41:26.739037 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:26.738971 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-769d6cb8-fm9k2" podUID="0313892d-db8d-4f99-8f65-1f01c14256bd" containerName="main" probeResult="failure" output="Get \"https://10.133.0.27:8000/health\": dial tcp 10.133.0.27:8000: connect: connection refused" Apr 17 17:41:28.868934 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:28.868895 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-79c5b74fb-zwv6t"] Apr 17 17:41:28.869492 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:28.869285 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-79c5b74fb-zwv6t" podUID="d31a00e2-ce2a-4b4e-ab86-26eba01505bf" containerName="main" containerID="cri-o://1167711d3f8b4faea3d3d35f821c1f38d07e253bd15bdecf7d547dcb7a390ba9" gracePeriod=30 Apr 17 17:41:36.738476 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:36.738436 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-769d6cb8-fm9k2" podUID="0313892d-db8d-4f99-8f65-1f01c14256bd" containerName="main" probeResult="failure" output="Get \"https://10.133.0.27:8000/health\": dial tcp 10.133.0.27:8000: connect: connection refused" Apr 17 17:41:46.738901 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:46.738854 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-769d6cb8-fm9k2" podUID="0313892d-db8d-4f99-8f65-1f01c14256bd" 
containerName="main" probeResult="failure" output="Get \"https://10.133.0.27:8000/health\": dial tcp 10.133.0.27:8000: connect: connection refused" Apr 17 17:41:47.510738 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:47.510704 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-5588885dd8-fx9rv"] Apr 17 17:41:47.511200 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:47.511183 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="33abd26c-a518-4b23-949b-c58b1d006c59" containerName="main" Apr 17 17:41:47.511273 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:47.511202 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="33abd26c-a518-4b23-949b-c58b1d006c59" containerName="main" Apr 17 17:41:47.511273 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:47.511215 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="33abd26c-a518-4b23-949b-c58b1d006c59" containerName="storage-initializer" Apr 17 17:41:47.511273 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:47.511223 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="33abd26c-a518-4b23-949b-c58b1d006c59" containerName="storage-initializer" Apr 17 17:41:47.511371 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:47.511288 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="33abd26c-a518-4b23-949b-c58b1d006c59" containerName="main" Apr 17 17:41:47.514528 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:47.514510 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5588885dd8-fx9rv" Apr 17 17:41:47.517570 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:47.517547 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-test-kserve-self-signed-certs\"" Apr 17 17:41:47.526727 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:47.526700 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-5588885dd8-fx9rv"] Apr 17 17:41:47.575863 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:47.575820 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/65c43d70-6989-455e-a571-8a88e61448a1-model-cache\") pod \"router-with-refs-test-kserve-5588885dd8-fx9rv\" (UID: \"65c43d70-6989-455e-a571-8a88e61448a1\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5588885dd8-fx9rv" Apr 17 17:41:47.576037 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:47.575875 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/65c43d70-6989-455e-a571-8a88e61448a1-tls-certs\") pod \"router-with-refs-test-kserve-5588885dd8-fx9rv\" (UID: \"65c43d70-6989-455e-a571-8a88e61448a1\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5588885dd8-fx9rv" Apr 17 17:41:47.576037 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:47.575964 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzm7j\" (UniqueName: \"kubernetes.io/projected/65c43d70-6989-455e-a571-8a88e61448a1-kube-api-access-tzm7j\") pod \"router-with-refs-test-kserve-5588885dd8-fx9rv\" (UID: \"65c43d70-6989-455e-a571-8a88e61448a1\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5588885dd8-fx9rv" Apr 17 17:41:47.576037 ip-10-0-139-84 kubenswrapper[2573]: 
I0417 17:41:47.575994 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/65c43d70-6989-455e-a571-8a88e61448a1-home\") pod \"router-with-refs-test-kserve-5588885dd8-fx9rv\" (UID: \"65c43d70-6989-455e-a571-8a88e61448a1\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5588885dd8-fx9rv" Apr 17 17:41:47.576037 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:47.576025 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/65c43d70-6989-455e-a571-8a88e61448a1-dshm\") pod \"router-with-refs-test-kserve-5588885dd8-fx9rv\" (UID: \"65c43d70-6989-455e-a571-8a88e61448a1\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5588885dd8-fx9rv" Apr 17 17:41:47.576276 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:47.576052 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/65c43d70-6989-455e-a571-8a88e61448a1-kserve-provision-location\") pod \"router-with-refs-test-kserve-5588885dd8-fx9rv\" (UID: \"65c43d70-6989-455e-a571-8a88e61448a1\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5588885dd8-fx9rv" Apr 17 17:41:47.677352 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:47.677309 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/65c43d70-6989-455e-a571-8a88e61448a1-tls-certs\") pod \"router-with-refs-test-kserve-5588885dd8-fx9rv\" (UID: \"65c43d70-6989-455e-a571-8a88e61448a1\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5588885dd8-fx9rv" Apr 17 17:41:47.677512 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:47.677371 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tzm7j\" (UniqueName: 
\"kubernetes.io/projected/65c43d70-6989-455e-a571-8a88e61448a1-kube-api-access-tzm7j\") pod \"router-with-refs-test-kserve-5588885dd8-fx9rv\" (UID: \"65c43d70-6989-455e-a571-8a88e61448a1\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5588885dd8-fx9rv" Apr 17 17:41:47.677512 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:47.677402 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/65c43d70-6989-455e-a571-8a88e61448a1-home\") pod \"router-with-refs-test-kserve-5588885dd8-fx9rv\" (UID: \"65c43d70-6989-455e-a571-8a88e61448a1\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5588885dd8-fx9rv" Apr 17 17:41:47.677512 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:47.677426 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/65c43d70-6989-455e-a571-8a88e61448a1-dshm\") pod \"router-with-refs-test-kserve-5588885dd8-fx9rv\" (UID: \"65c43d70-6989-455e-a571-8a88e61448a1\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5588885dd8-fx9rv" Apr 17 17:41:47.677512 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:47.677454 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/65c43d70-6989-455e-a571-8a88e61448a1-kserve-provision-location\") pod \"router-with-refs-test-kserve-5588885dd8-fx9rv\" (UID: \"65c43d70-6989-455e-a571-8a88e61448a1\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5588885dd8-fx9rv" Apr 17 17:41:47.677740 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:47.677681 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/65c43d70-6989-455e-a571-8a88e61448a1-model-cache\") pod \"router-with-refs-test-kserve-5588885dd8-fx9rv\" (UID: \"65c43d70-6989-455e-a571-8a88e61448a1\") " 
pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5588885dd8-fx9rv" Apr 17 17:41:47.677907 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:47.677877 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/65c43d70-6989-455e-a571-8a88e61448a1-home\") pod \"router-with-refs-test-kserve-5588885dd8-fx9rv\" (UID: \"65c43d70-6989-455e-a571-8a88e61448a1\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5588885dd8-fx9rv" Apr 17 17:41:47.678059 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:47.678043 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/65c43d70-6989-455e-a571-8a88e61448a1-kserve-provision-location\") pod \"router-with-refs-test-kserve-5588885dd8-fx9rv\" (UID: \"65c43d70-6989-455e-a571-8a88e61448a1\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5588885dd8-fx9rv" Apr 17 17:41:47.678134 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:47.678067 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/65c43d70-6989-455e-a571-8a88e61448a1-model-cache\") pod \"router-with-refs-test-kserve-5588885dd8-fx9rv\" (UID: \"65c43d70-6989-455e-a571-8a88e61448a1\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5588885dd8-fx9rv" Apr 17 17:41:47.679902 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:47.679878 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/65c43d70-6989-455e-a571-8a88e61448a1-tls-certs\") pod \"router-with-refs-test-kserve-5588885dd8-fx9rv\" (UID: \"65c43d70-6989-455e-a571-8a88e61448a1\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5588885dd8-fx9rv" Apr 17 17:41:47.680145 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:47.680126 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: 
\"kubernetes.io/empty-dir/65c43d70-6989-455e-a571-8a88e61448a1-dshm\") pod \"router-with-refs-test-kserve-5588885dd8-fx9rv\" (UID: \"65c43d70-6989-455e-a571-8a88e61448a1\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5588885dd8-fx9rv" Apr 17 17:41:47.688446 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:47.688423 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzm7j\" (UniqueName: \"kubernetes.io/projected/65c43d70-6989-455e-a571-8a88e61448a1-kube-api-access-tzm7j\") pod \"router-with-refs-test-kserve-5588885dd8-fx9rv\" (UID: \"65c43d70-6989-455e-a571-8a88e61448a1\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5588885dd8-fx9rv" Apr 17 17:41:47.826713 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:47.826631 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5588885dd8-fx9rv" Apr 17 17:41:47.952678 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:47.952648 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-5588885dd8-fx9rv"] Apr 17 17:41:47.958009 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:41:47.957982 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod65c43d70_6989_455e_a571_8a88e61448a1.slice/crio-2f881d9f0046ed7621bdd452f8f5982297546d92b6ad18545f043767a456bb8a WatchSource:0}: Error finding container 2f881d9f0046ed7621bdd452f8f5982297546d92b6ad18545f043767a456bb8a: Status 404 returned error can't find the container with id 2f881d9f0046ed7621bdd452f8f5982297546d92b6ad18545f043767a456bb8a Apr 17 17:41:48.523924 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:48.523887 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5588885dd8-fx9rv" 
event={"ID":"65c43d70-6989-455e-a571-8a88e61448a1","Type":"ContainerStarted","Data":"f67d16ff561cd3b466997c62cc3c35feabb539aeab4df4f920f3f5df93e85a1f"} Apr 17 17:41:48.523924 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:48.523925 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5588885dd8-fx9rv" event={"ID":"65c43d70-6989-455e-a571-8a88e61448a1","Type":"ContainerStarted","Data":"2f881d9f0046ed7621bdd452f8f5982297546d92b6ad18545f043767a456bb8a"} Apr 17 17:41:52.538151 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:52.538113 2573 generic.go:358] "Generic (PLEG): container finished" podID="65c43d70-6989-455e-a571-8a88e61448a1" containerID="f67d16ff561cd3b466997c62cc3c35feabb539aeab4df4f920f3f5df93e85a1f" exitCode=0 Apr 17 17:41:52.538596 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:52.538191 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5588885dd8-fx9rv" event={"ID":"65c43d70-6989-455e-a571-8a88e61448a1","Type":"ContainerDied","Data":"f67d16ff561cd3b466997c62cc3c35feabb539aeab4df4f920f3f5df93e85a1f"} Apr 17 17:41:53.542967 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:53.542928 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5588885dd8-fx9rv" event={"ID":"65c43d70-6989-455e-a571-8a88e61448a1","Type":"ContainerStarted","Data":"c7944e4758f3a2fbb46ea128317c0905f25f6ca0267692f78a4b33a73fe88095"} Apr 17 17:41:53.566346 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:53.566283 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5588885dd8-fx9rv" podStartSLOduration=6.566266178 podStartE2EDuration="6.566266178s" podCreationTimestamp="2026-04-17 17:41:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:41:53.564442452 
+0000 UTC m=+1058.880524295" watchObservedRunningTime="2026-04-17 17:41:53.566266178 +0000 UTC m=+1058.882348021" Apr 17 17:41:56.738831 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:56.738782 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-769d6cb8-fm9k2" podUID="0313892d-db8d-4f99-8f65-1f01c14256bd" containerName="main" probeResult="failure" output="Get \"https://10.133.0.27:8000/health\": dial tcp 10.133.0.27:8000: connect: connection refused" Apr 17 17:41:57.827003 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:57.826963 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5588885dd8-fx9rv" Apr 17 17:41:57.827003 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:57.827006 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5588885dd8-fx9rv" Apr 17 17:41:57.828787 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:57.828754 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5588885dd8-fx9rv" podUID="65c43d70-6989-455e-a571-8a88e61448a1" containerName="main" probeResult="failure" output="Get \"https://10.133.0.28:8000/health\": dial tcp 10.133.0.28:8000: connect: connection refused" Apr 17 17:41:59.110372 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:59.110352 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-test-kserve-79c5b74fb-zwv6t_d31a00e2-ce2a-4b4e-ab86-26eba01505bf/main/0.log" Apr 17 17:41:59.110762 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:59.110748 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-79c5b74fb-zwv6t" Apr 17 17:41:59.281905 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:59.281826 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d31a00e2-ce2a-4b4e-ab86-26eba01505bf-tls-certs\") pod \"d31a00e2-ce2a-4b4e-ab86-26eba01505bf\" (UID: \"d31a00e2-ce2a-4b4e-ab86-26eba01505bf\") " Apr 17 17:41:59.281905 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:59.281887 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d31a00e2-ce2a-4b4e-ab86-26eba01505bf-model-cache\") pod \"d31a00e2-ce2a-4b4e-ab86-26eba01505bf\" (UID: \"d31a00e2-ce2a-4b4e-ab86-26eba01505bf\") " Apr 17 17:41:59.282148 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:59.281915 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d31a00e2-ce2a-4b4e-ab86-26eba01505bf-home\") pod \"d31a00e2-ce2a-4b4e-ab86-26eba01505bf\" (UID: \"d31a00e2-ce2a-4b4e-ab86-26eba01505bf\") " Apr 17 17:41:59.282148 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:59.281934 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d31a00e2-ce2a-4b4e-ab86-26eba01505bf-dshm\") pod \"d31a00e2-ce2a-4b4e-ab86-26eba01505bf\" (UID: \"d31a00e2-ce2a-4b4e-ab86-26eba01505bf\") " Apr 17 17:41:59.282148 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:59.281970 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d31a00e2-ce2a-4b4e-ab86-26eba01505bf-kserve-provision-location\") pod \"d31a00e2-ce2a-4b4e-ab86-26eba01505bf\" (UID: \"d31a00e2-ce2a-4b4e-ab86-26eba01505bf\") " Apr 17 17:41:59.282148 ip-10-0-139-84 kubenswrapper[2573]: I0417 
17:41:59.282005 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qrn6r\" (UniqueName: \"kubernetes.io/projected/d31a00e2-ce2a-4b4e-ab86-26eba01505bf-kube-api-access-qrn6r\") pod \"d31a00e2-ce2a-4b4e-ab86-26eba01505bf\" (UID: \"d31a00e2-ce2a-4b4e-ab86-26eba01505bf\") " Apr 17 17:41:59.282630 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:59.282515 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d31a00e2-ce2a-4b4e-ab86-26eba01505bf-home" (OuterVolumeSpecName: "home") pod "d31a00e2-ce2a-4b4e-ab86-26eba01505bf" (UID: "d31a00e2-ce2a-4b4e-ab86-26eba01505bf"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:41:59.282807 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:59.282778 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d31a00e2-ce2a-4b4e-ab86-26eba01505bf-model-cache" (OuterVolumeSpecName: "model-cache") pod "d31a00e2-ce2a-4b4e-ab86-26eba01505bf" (UID: "d31a00e2-ce2a-4b4e-ab86-26eba01505bf"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:41:59.284627 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:59.284597 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d31a00e2-ce2a-4b4e-ab86-26eba01505bf-dshm" (OuterVolumeSpecName: "dshm") pod "d31a00e2-ce2a-4b4e-ab86-26eba01505bf" (UID: "d31a00e2-ce2a-4b4e-ab86-26eba01505bf"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:41:59.284737 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:59.284697 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d31a00e2-ce2a-4b4e-ab86-26eba01505bf-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "d31a00e2-ce2a-4b4e-ab86-26eba01505bf" (UID: "d31a00e2-ce2a-4b4e-ab86-26eba01505bf"). 
InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:41:59.284918 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:59.284894 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d31a00e2-ce2a-4b4e-ab86-26eba01505bf-kube-api-access-qrn6r" (OuterVolumeSpecName: "kube-api-access-qrn6r") pod "d31a00e2-ce2a-4b4e-ab86-26eba01505bf" (UID: "d31a00e2-ce2a-4b4e-ab86-26eba01505bf"). InnerVolumeSpecName "kube-api-access-qrn6r". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:41:59.346283 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:59.346242 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d31a00e2-ce2a-4b4e-ab86-26eba01505bf-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "d31a00e2-ce2a-4b4e-ab86-26eba01505bf" (UID: "d31a00e2-ce2a-4b4e-ab86-26eba01505bf"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:41:59.383471 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:59.383439 2573 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d31a00e2-ce2a-4b4e-ab86-26eba01505bf-tls-certs\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\"" Apr 17 17:41:59.383603 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:59.383477 2573 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d31a00e2-ce2a-4b4e-ab86-26eba01505bf-model-cache\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\"" Apr 17 17:41:59.383603 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:59.383491 2573 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d31a00e2-ce2a-4b4e-ab86-26eba01505bf-home\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\"" Apr 17 17:41:59.383603 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:59.383504 2573 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d31a00e2-ce2a-4b4e-ab86-26eba01505bf-dshm\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\"" Apr 17 17:41:59.383603 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:59.383518 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d31a00e2-ce2a-4b4e-ab86-26eba01505bf-kserve-provision-location\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\"" Apr 17 17:41:59.383603 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:59.383532 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qrn6r\" (UniqueName: \"kubernetes.io/projected/d31a00e2-ce2a-4b4e-ab86-26eba01505bf-kube-api-access-qrn6r\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\"" Apr 17 17:41:59.563754 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:59.563677 2573 log.go:25] "Finished 
parsing log file" path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-test-kserve-79c5b74fb-zwv6t_d31a00e2-ce2a-4b4e-ab86-26eba01505bf/main/0.log" Apr 17 17:41:59.564157 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:59.564128 2573 generic.go:358] "Generic (PLEG): container finished" podID="d31a00e2-ce2a-4b4e-ab86-26eba01505bf" containerID="1167711d3f8b4faea3d3d35f821c1f38d07e253bd15bdecf7d547dcb7a390ba9" exitCode=137 Apr 17 17:41:59.564247 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:59.564219 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-79c5b74fb-zwv6t" Apr 17 17:41:59.564247 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:59.564217 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-79c5b74fb-zwv6t" event={"ID":"d31a00e2-ce2a-4b4e-ab86-26eba01505bf","Type":"ContainerDied","Data":"1167711d3f8b4faea3d3d35f821c1f38d07e253bd15bdecf7d547dcb7a390ba9"} Apr 17 17:41:59.564386 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:59.564262 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-79c5b74fb-zwv6t" event={"ID":"d31a00e2-ce2a-4b4e-ab86-26eba01505bf","Type":"ContainerDied","Data":"a495eb59b0fc5136cfe93954d7f2c8b4efa667878a418d6e42aeeb1542224195"} Apr 17 17:41:59.564386 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:59.564283 2573 scope.go:117] "RemoveContainer" containerID="1167711d3f8b4faea3d3d35f821c1f38d07e253bd15bdecf7d547dcb7a390ba9" Apr 17 17:41:59.583568 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:59.583546 2573 scope.go:117] "RemoveContainer" containerID="6cbe5de6150889d81e1906c4aef0440c95bcbb909088b19454b7e8fecd4866d5" Apr 17 17:41:59.590504 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:59.590481 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-79c5b74fb-zwv6t"] Apr 17 
17:41:59.595003 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:59.594825 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-79c5b74fb-zwv6t"] Apr 17 17:41:59.595776 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:59.595754 2573 scope.go:117] "RemoveContainer" containerID="1167711d3f8b4faea3d3d35f821c1f38d07e253bd15bdecf7d547dcb7a390ba9" Apr 17 17:41:59.598443 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:41:59.598413 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1167711d3f8b4faea3d3d35f821c1f38d07e253bd15bdecf7d547dcb7a390ba9\": container with ID starting with 1167711d3f8b4faea3d3d35f821c1f38d07e253bd15bdecf7d547dcb7a390ba9 not found: ID does not exist" containerID="1167711d3f8b4faea3d3d35f821c1f38d07e253bd15bdecf7d547dcb7a390ba9" Apr 17 17:41:59.598544 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:59.598448 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1167711d3f8b4faea3d3d35f821c1f38d07e253bd15bdecf7d547dcb7a390ba9"} err="failed to get container status \"1167711d3f8b4faea3d3d35f821c1f38d07e253bd15bdecf7d547dcb7a390ba9\": rpc error: code = NotFound desc = could not find container \"1167711d3f8b4faea3d3d35f821c1f38d07e253bd15bdecf7d547dcb7a390ba9\": container with ID starting with 1167711d3f8b4faea3d3d35f821c1f38d07e253bd15bdecf7d547dcb7a390ba9 not found: ID does not exist" Apr 17 17:41:59.598544 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:59.598473 2573 scope.go:117] "RemoveContainer" containerID="6cbe5de6150889d81e1906c4aef0440c95bcbb909088b19454b7e8fecd4866d5" Apr 17 17:41:59.599111 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:41:59.599078 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6cbe5de6150889d81e1906c4aef0440c95bcbb909088b19454b7e8fecd4866d5\": container with ID starting 
with 6cbe5de6150889d81e1906c4aef0440c95bcbb909088b19454b7e8fecd4866d5 not found: ID does not exist" containerID="6cbe5de6150889d81e1906c4aef0440c95bcbb909088b19454b7e8fecd4866d5" Apr 17 17:41:59.599183 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:41:59.599115 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6cbe5de6150889d81e1906c4aef0440c95bcbb909088b19454b7e8fecd4866d5"} err="failed to get container status \"6cbe5de6150889d81e1906c4aef0440c95bcbb909088b19454b7e8fecd4866d5\": rpc error: code = NotFound desc = could not find container \"6cbe5de6150889d81e1906c4aef0440c95bcbb909088b19454b7e8fecd4866d5\": container with ID starting with 6cbe5de6150889d81e1906c4aef0440c95bcbb909088b19454b7e8fecd4866d5 not found: ID does not exist" Apr 17 17:42:01.196235 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:42:01.196202 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d31a00e2-ce2a-4b4e-ab86-26eba01505bf" path="/var/lib/kubelet/pods/d31a00e2-ce2a-4b4e-ab86-26eba01505bf/volumes" Apr 17 17:42:06.739054 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:42:06.738920 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-769d6cb8-fm9k2" podUID="0313892d-db8d-4f99-8f65-1f01c14256bd" containerName="main" probeResult="failure" output="Get \"https://10.133.0.27:8000/health\": dial tcp 10.133.0.27:8000: connect: connection refused" Apr 17 17:42:07.828116 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:42:07.828056 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5588885dd8-fx9rv" podUID="65c43d70-6989-455e-a571-8a88e61448a1" containerName="main" probeResult="failure" output="Get \"https://10.133.0.28:8000/health\": dial tcp 10.133.0.28:8000: connect: connection refused" Apr 17 17:42:15.174582 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:42:15.174545 2573 scope.go:117] "RemoveContainer" 
containerID="172c283cac429fb75742206ee5f5d9cf21f3926945c3bd9b11c4d3c8fa3bb305" Apr 17 17:42:15.182782 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:42:15.182760 2573 scope.go:117] "RemoveContainer" containerID="d24476a5e5c2b8556e2ab2d5bb8c42bd153136f3ae223afa2646ba38dceed3e0" Apr 17 17:42:16.738395 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:42:16.738343 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-769d6cb8-fm9k2" podUID="0313892d-db8d-4f99-8f65-1f01c14256bd" containerName="main" probeResult="failure" output="Get \"https://10.133.0.27:8000/health\": dial tcp 10.133.0.27:8000: connect: connection refused" Apr 17 17:42:17.827924 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:42:17.827871 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5588885dd8-fx9rv" podUID="65c43d70-6989-455e-a571-8a88e61448a1" containerName="main" probeResult="failure" output="Get \"https://10.133.0.28:8000/health\": dial tcp 10.133.0.28:8000: connect: connection refused" Apr 17 17:42:26.738488 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:42:26.738438 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-769d6cb8-fm9k2" podUID="0313892d-db8d-4f99-8f65-1f01c14256bd" containerName="main" probeResult="failure" output="Get \"https://10.133.0.27:8000/health\": dial tcp 10.133.0.27:8000: connect: connection refused" Apr 17 17:42:27.827888 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:42:27.827831 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5588885dd8-fx9rv" podUID="65c43d70-6989-455e-a571-8a88e61448a1" containerName="main" probeResult="failure" output="Get \"https://10.133.0.28:8000/health\": dial tcp 10.133.0.28:8000: connect: connection refused" Apr 17 17:42:36.738555 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:42:36.738511 2573 prober.go:120] "Probe failed" 
probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-769d6cb8-fm9k2" podUID="0313892d-db8d-4f99-8f65-1f01c14256bd" containerName="main" probeResult="failure" output="Get \"https://10.133.0.27:8000/health\": dial tcp 10.133.0.27:8000: connect: connection refused" Apr 17 17:42:37.827040 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:42:37.826983 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5588885dd8-fx9rv" podUID="65c43d70-6989-455e-a571-8a88e61448a1" containerName="main" probeResult="failure" output="Get \"https://10.133.0.28:8000/health\": dial tcp 10.133.0.28:8000: connect: connection refused" Apr 17 17:42:46.748119 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:42:46.748087 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/stop-feature-test-kserve-769d6cb8-fm9k2" Apr 17 17:42:46.760081 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:42:46.760060 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-769d6cb8-fm9k2" Apr 17 17:42:47.619215 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:42:47.619183 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-769d6cb8-fm9k2"] Apr 17 17:42:47.823535 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:42:47.823502 2573 secret.go:189] Couldn't get secret kserve-ci-e2e-test/stop-feature-test-kserve-self-signed-certs: secret "stop-feature-test-kserve-self-signed-certs" not found Apr 17 17:42:47.823866 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:42:47.823582 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0313892d-db8d-4f99-8f65-1f01c14256bd-tls-certs podName:0313892d-db8d-4f99-8f65-1f01c14256bd nodeName:}" failed. No retries permitted until 2026-04-17 17:42:48.323562981 +0000 UTC m=+1113.639644803 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/0313892d-db8d-4f99-8f65-1f01c14256bd-tls-certs") pod "stop-feature-test-kserve-769d6cb8-fm9k2" (UID: "0313892d-db8d-4f99-8f65-1f01c14256bd") : secret "stop-feature-test-kserve-self-signed-certs" not found Apr 17 17:42:47.827546 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:42:47.827514 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5588885dd8-fx9rv" podUID="65c43d70-6989-455e-a571-8a88e61448a1" containerName="main" probeResult="failure" output="Get \"https://10.133.0.28:8000/health\": dial tcp 10.133.0.28:8000: connect: connection refused" Apr 17 17:42:48.328104 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:42:48.328075 2573 secret.go:189] Couldn't get secret kserve-ci-e2e-test/stop-feature-test-kserve-self-signed-certs: secret "stop-feature-test-kserve-self-signed-certs" not found Apr 17 17:42:48.328263 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:42:48.328145 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0313892d-db8d-4f99-8f65-1f01c14256bd-tls-certs podName:0313892d-db8d-4f99-8f65-1f01c14256bd nodeName:}" failed. No retries permitted until 2026-04-17 17:42:49.328131767 +0000 UTC m=+1114.644213587 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/0313892d-db8d-4f99-8f65-1f01c14256bd-tls-certs") pod "stop-feature-test-kserve-769d6cb8-fm9k2" (UID: "0313892d-db8d-4f99-8f65-1f01c14256bd") : secret "stop-feature-test-kserve-self-signed-certs" not found Apr 17 17:42:48.732634 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:42:48.732590 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-769d6cb8-fm9k2" podUID="0313892d-db8d-4f99-8f65-1f01c14256bd" containerName="main" containerID="cri-o://578d5b803fbb883de166b2cf178599584b08ec3d2e760389e27fd79ccd783ea2" gracePeriod=30 Apr 17 17:42:49.335436 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:42:49.335409 2573 secret.go:189] Couldn't get secret kserve-ci-e2e-test/stop-feature-test-kserve-self-signed-certs: secret "stop-feature-test-kserve-self-signed-certs" not found Apr 17 17:42:49.335783 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:42:49.335482 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0313892d-db8d-4f99-8f65-1f01c14256bd-tls-certs podName:0313892d-db8d-4f99-8f65-1f01c14256bd nodeName:}" failed. No retries permitted until 2026-04-17 17:42:51.335466804 +0000 UTC m=+1116.651548625 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/0313892d-db8d-4f99-8f65-1f01c14256bd-tls-certs") pod "stop-feature-test-kserve-769d6cb8-fm9k2" (UID: "0313892d-db8d-4f99-8f65-1f01c14256bd") : secret "stop-feature-test-kserve-self-signed-certs" not found Apr 17 17:42:51.350940 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:42:51.350858 2573 secret.go:189] Couldn't get secret kserve-ci-e2e-test/stop-feature-test-kserve-self-signed-certs: secret "stop-feature-test-kserve-self-signed-certs" not found Apr 17 17:42:51.350940 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:42:51.350925 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0313892d-db8d-4f99-8f65-1f01c14256bd-tls-certs podName:0313892d-db8d-4f99-8f65-1f01c14256bd nodeName:}" failed. No retries permitted until 2026-04-17 17:42:55.35091074 +0000 UTC m=+1120.666992560 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/0313892d-db8d-4f99-8f65-1f01c14256bd-tls-certs") pod "stop-feature-test-kserve-769d6cb8-fm9k2" (UID: "0313892d-db8d-4f99-8f65-1f01c14256bd") : secret "stop-feature-test-kserve-self-signed-certs" not found Apr 17 17:42:55.383220 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:42:55.383194 2573 secret.go:189] Couldn't get secret kserve-ci-e2e-test/stop-feature-test-kserve-self-signed-certs: secret "stop-feature-test-kserve-self-signed-certs" not found Apr 17 17:42:55.383580 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:42:55.383259 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0313892d-db8d-4f99-8f65-1f01c14256bd-tls-certs podName:0313892d-db8d-4f99-8f65-1f01c14256bd nodeName:}" failed. No retries permitted until 2026-04-17 17:43:03.383240414 +0000 UTC m=+1128.699322235 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/0313892d-db8d-4f99-8f65-1f01c14256bd-tls-certs") pod "stop-feature-test-kserve-769d6cb8-fm9k2" (UID: "0313892d-db8d-4f99-8f65-1f01c14256bd") : secret "stop-feature-test-kserve-self-signed-certs" not found Apr 17 17:42:57.827156 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:42:57.827114 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5588885dd8-fx9rv" podUID="65c43d70-6989-455e-a571-8a88e61448a1" containerName="main" probeResult="failure" output="Get \"https://10.133.0.28:8000/health\": dial tcp 10.133.0.28:8000: connect: connection refused" Apr 17 17:43:03.438223 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:43:03.438189 2573 secret.go:189] Couldn't get secret kserve-ci-e2e-test/stop-feature-test-kserve-self-signed-certs: secret "stop-feature-test-kserve-self-signed-certs" not found Apr 17 17:43:03.438565 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:43:03.438264 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0313892d-db8d-4f99-8f65-1f01c14256bd-tls-certs podName:0313892d-db8d-4f99-8f65-1f01c14256bd nodeName:}" failed. No retries permitted until 2026-04-17 17:43:19.438250464 +0000 UTC m=+1144.754332284 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/0313892d-db8d-4f99-8f65-1f01c14256bd-tls-certs") pod "stop-feature-test-kserve-769d6cb8-fm9k2" (UID: "0313892d-db8d-4f99-8f65-1f01c14256bd") : secret "stop-feature-test-kserve-self-signed-certs" not found Apr 17 17:43:07.827435 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:43:07.827388 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5588885dd8-fx9rv" podUID="65c43d70-6989-455e-a571-8a88e61448a1" containerName="main" probeResult="failure" output="Get \"https://10.133.0.28:8000/health\": dial tcp 10.133.0.28:8000: connect: connection refused" Apr 17 17:43:17.828032 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:43:17.827982 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5588885dd8-fx9rv" podUID="65c43d70-6989-455e-a571-8a88e61448a1" containerName="main" probeResult="failure" output="Get \"https://10.133.0.28:8000/health\": dial tcp 10.133.0.28:8000: connect: connection refused" Apr 17 17:43:19.053774 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:43:19.053752 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-769d6cb8-fm9k2_0313892d-db8d-4f99-8f65-1f01c14256bd/main/0.log" Apr 17 17:43:19.054159 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:43:19.054086 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-769d6cb8-fm9k2" Apr 17 17:43:19.179254 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:43:19.179210 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cf44f\" (UniqueName: \"kubernetes.io/projected/0313892d-db8d-4f99-8f65-1f01c14256bd-kube-api-access-cf44f\") pod \"0313892d-db8d-4f99-8f65-1f01c14256bd\" (UID: \"0313892d-db8d-4f99-8f65-1f01c14256bd\") " Apr 17 17:43:19.179444 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:43:19.179273 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0313892d-db8d-4f99-8f65-1f01c14256bd-tls-certs\") pod \"0313892d-db8d-4f99-8f65-1f01c14256bd\" (UID: \"0313892d-db8d-4f99-8f65-1f01c14256bd\") " Apr 17 17:43:19.179444 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:43:19.179309 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/0313892d-db8d-4f99-8f65-1f01c14256bd-home\") pod \"0313892d-db8d-4f99-8f65-1f01c14256bd\" (UID: \"0313892d-db8d-4f99-8f65-1f01c14256bd\") " Apr 17 17:43:19.179444 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:43:19.179346 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/0313892d-db8d-4f99-8f65-1f01c14256bd-model-cache\") pod \"0313892d-db8d-4f99-8f65-1f01c14256bd\" (UID: \"0313892d-db8d-4f99-8f65-1f01c14256bd\") " Apr 17 17:43:19.179444 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:43:19.179410 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0313892d-db8d-4f99-8f65-1f01c14256bd-kserve-provision-location\") pod \"0313892d-db8d-4f99-8f65-1f01c14256bd\" (UID: \"0313892d-db8d-4f99-8f65-1f01c14256bd\") " Apr 17 17:43:19.179697 ip-10-0-139-84 
kubenswrapper[2573]: I0417 17:43:19.179454 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/0313892d-db8d-4f99-8f65-1f01c14256bd-dshm\") pod \"0313892d-db8d-4f99-8f65-1f01c14256bd\" (UID: \"0313892d-db8d-4f99-8f65-1f01c14256bd\") " Apr 17 17:43:19.179897 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:43:19.179862 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0313892d-db8d-4f99-8f65-1f01c14256bd-model-cache" (OuterVolumeSpecName: "model-cache") pod "0313892d-db8d-4f99-8f65-1f01c14256bd" (UID: "0313892d-db8d-4f99-8f65-1f01c14256bd"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:43:19.180295 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:43:19.180264 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0313892d-db8d-4f99-8f65-1f01c14256bd-home" (OuterVolumeSpecName: "home") pod "0313892d-db8d-4f99-8f65-1f01c14256bd" (UID: "0313892d-db8d-4f99-8f65-1f01c14256bd"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:43:19.182550 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:43:19.182512 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0313892d-db8d-4f99-8f65-1f01c14256bd-dshm" (OuterVolumeSpecName: "dshm") pod "0313892d-db8d-4f99-8f65-1f01c14256bd" (UID: "0313892d-db8d-4f99-8f65-1f01c14256bd"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:43:19.189145 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:43:19.189120 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0313892d-db8d-4f99-8f65-1f01c14256bd-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "0313892d-db8d-4f99-8f65-1f01c14256bd" (UID: "0313892d-db8d-4f99-8f65-1f01c14256bd"). 
InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:43:19.189270 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:43:19.189168 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0313892d-db8d-4f99-8f65-1f01c14256bd-kube-api-access-cf44f" (OuterVolumeSpecName: "kube-api-access-cf44f") pod "0313892d-db8d-4f99-8f65-1f01c14256bd" (UID: "0313892d-db8d-4f99-8f65-1f01c14256bd"). InnerVolumeSpecName "kube-api-access-cf44f". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:43:19.240010 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:43:19.239972 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0313892d-db8d-4f99-8f65-1f01c14256bd-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "0313892d-db8d-4f99-8f65-1f01c14256bd" (UID: "0313892d-db8d-4f99-8f65-1f01c14256bd"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:43:19.280573 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:43:19.280538 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cf44f\" (UniqueName: \"kubernetes.io/projected/0313892d-db8d-4f99-8f65-1f01c14256bd-kube-api-access-cf44f\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\"" Apr 17 17:43:19.280702 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:43:19.280576 2573 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0313892d-db8d-4f99-8f65-1f01c14256bd-tls-certs\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\"" Apr 17 17:43:19.280702 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:43:19.280595 2573 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/0313892d-db8d-4f99-8f65-1f01c14256bd-home\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\"" Apr 17 17:43:19.280702 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:43:19.280609 2573 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/0313892d-db8d-4f99-8f65-1f01c14256bd-model-cache\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\"" Apr 17 17:43:19.280702 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:43:19.280625 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0313892d-db8d-4f99-8f65-1f01c14256bd-kserve-provision-location\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\"" Apr 17 17:43:19.280702 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:43:19.280638 2573 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/0313892d-db8d-4f99-8f65-1f01c14256bd-dshm\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\"" Apr 17 17:43:19.834288 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:43:19.834259 2573 log.go:25] "Finished 
parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-769d6cb8-fm9k2_0313892d-db8d-4f99-8f65-1f01c14256bd/main/0.log" Apr 17 17:43:19.834615 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:43:19.834588 2573 generic.go:358] "Generic (PLEG): container finished" podID="0313892d-db8d-4f99-8f65-1f01c14256bd" containerID="578d5b803fbb883de166b2cf178599584b08ec3d2e760389e27fd79ccd783ea2" exitCode=137 Apr 17 17:43:19.834686 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:43:19.834631 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-769d6cb8-fm9k2" event={"ID":"0313892d-db8d-4f99-8f65-1f01c14256bd","Type":"ContainerDied","Data":"578d5b803fbb883de166b2cf178599584b08ec3d2e760389e27fd79ccd783ea2"} Apr 17 17:43:19.834686 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:43:19.834653 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-769d6cb8-fm9k2" event={"ID":"0313892d-db8d-4f99-8f65-1f01c14256bd","Type":"ContainerDied","Data":"c787fe46a1b6c60e3c06c931d77634dc9e7c590eebc16b3c32cfa1cc984b7024"} Apr 17 17:43:19.834686 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:43:19.834653 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-769d6cb8-fm9k2" Apr 17 17:43:19.834686 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:43:19.834665 2573 scope.go:117] "RemoveContainer" containerID="578d5b803fbb883de166b2cf178599584b08ec3d2e760389e27fd79ccd783ea2" Apr 17 17:43:19.860849 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:43:19.860820 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-769d6cb8-fm9k2"] Apr 17 17:43:19.866443 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:43:19.866417 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-769d6cb8-fm9k2"] Apr 17 17:43:19.866941 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:43:19.866921 2573 scope.go:117] "RemoveContainer" containerID="71525a1af425a52245f7732ba7e16f99bf6cdcb5b5dbb1b6d5d3a460b6a8939d" Apr 17 17:43:19.934275 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:43:19.934257 2573 scope.go:117] "RemoveContainer" containerID="578d5b803fbb883de166b2cf178599584b08ec3d2e760389e27fd79ccd783ea2" Apr 17 17:43:19.934559 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:43:19.934531 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"578d5b803fbb883de166b2cf178599584b08ec3d2e760389e27fd79ccd783ea2\": container with ID starting with 578d5b803fbb883de166b2cf178599584b08ec3d2e760389e27fd79ccd783ea2 not found: ID does not exist" containerID="578d5b803fbb883de166b2cf178599584b08ec3d2e760389e27fd79ccd783ea2" Apr 17 17:43:19.934601 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:43:19.934562 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"578d5b803fbb883de166b2cf178599584b08ec3d2e760389e27fd79ccd783ea2"} err="failed to get container status \"578d5b803fbb883de166b2cf178599584b08ec3d2e760389e27fd79ccd783ea2\": rpc error: code = NotFound desc = could not find container 
\"578d5b803fbb883de166b2cf178599584b08ec3d2e760389e27fd79ccd783ea2\": container with ID starting with 578d5b803fbb883de166b2cf178599584b08ec3d2e760389e27fd79ccd783ea2 not found: ID does not exist" Apr 17 17:43:19.934601 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:43:19.934582 2573 scope.go:117] "RemoveContainer" containerID="71525a1af425a52245f7732ba7e16f99bf6cdcb5b5dbb1b6d5d3a460b6a8939d" Apr 17 17:43:19.934805 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:43:19.934789 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71525a1af425a52245f7732ba7e16f99bf6cdcb5b5dbb1b6d5d3a460b6a8939d\": container with ID starting with 71525a1af425a52245f7732ba7e16f99bf6cdcb5b5dbb1b6d5d3a460b6a8939d not found: ID does not exist" containerID="71525a1af425a52245f7732ba7e16f99bf6cdcb5b5dbb1b6d5d3a460b6a8939d" Apr 17 17:43:19.934855 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:43:19.934811 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71525a1af425a52245f7732ba7e16f99bf6cdcb5b5dbb1b6d5d3a460b6a8939d"} err="failed to get container status \"71525a1af425a52245f7732ba7e16f99bf6cdcb5b5dbb1b6d5d3a460b6a8939d\": rpc error: code = NotFound desc = could not find container \"71525a1af425a52245f7732ba7e16f99bf6cdcb5b5dbb1b6d5d3a460b6a8939d\": container with ID starting with 71525a1af425a52245f7732ba7e16f99bf6cdcb5b5dbb1b6d5d3a460b6a8939d not found: ID does not exist" Apr 17 17:43:21.196111 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:43:21.196077 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0313892d-db8d-4f99-8f65-1f01c14256bd" path="/var/lib/kubelet/pods/0313892d-db8d-4f99-8f65-1f01c14256bd/volumes" Apr 17 17:43:27.837630 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:43:27.837591 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5588885dd8-fx9rv" Apr 17 
17:43:27.845936 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:43:27.845906 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5588885dd8-fx9rv" Apr 17 17:43:49.332677 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:43:49.332641 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-5588885dd8-fx9rv"] Apr 17 17:43:49.333274 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:43:49.332921 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5588885dd8-fx9rv" podUID="65c43d70-6989-455e-a571-8a88e61448a1" containerName="main" containerID="cri-o://c7944e4758f3a2fbb46ea128317c0905f25f6ca0267692f78a4b33a73fe88095" gracePeriod=30 Apr 17 17:44:08.908087 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:08.908048 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7684bd49854bvf2"] Apr 17 17:44:08.908550 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:08.908337 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d31a00e2-ce2a-4b4e-ab86-26eba01505bf" containerName="main" Apr 17 17:44:08.908550 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:08.908348 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="d31a00e2-ce2a-4b4e-ab86-26eba01505bf" containerName="main" Apr 17 17:44:08.908550 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:08.908357 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0313892d-db8d-4f99-8f65-1f01c14256bd" containerName="storage-initializer" Apr 17 17:44:08.908550 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:08.908362 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="0313892d-db8d-4f99-8f65-1f01c14256bd" containerName="storage-initializer" Apr 17 17:44:08.908550 ip-10-0-139-84 kubenswrapper[2573]: I0417 
17:44:08.908369 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0313892d-db8d-4f99-8f65-1f01c14256bd" containerName="main" Apr 17 17:44:08.908550 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:08.908375 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="0313892d-db8d-4f99-8f65-1f01c14256bd" containerName="main" Apr 17 17:44:08.908550 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:08.908380 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d31a00e2-ce2a-4b4e-ab86-26eba01505bf" containerName="storage-initializer" Apr 17 17:44:08.908550 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:08.908385 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="d31a00e2-ce2a-4b4e-ab86-26eba01505bf" containerName="storage-initializer" Apr 17 17:44:08.908550 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:08.908434 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="d31a00e2-ce2a-4b4e-ab86-26eba01505bf" containerName="main" Apr 17 17:44:08.908550 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:08.908443 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="0313892d-db8d-4f99-8f65-1f01c14256bd" containerName="main" Apr 17 17:44:08.913052 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:08.913034 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7684bd49854bvf2" Apr 17 17:44:08.915968 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:08.915953 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv8f1a6f044e8c7a4d31a250e0c4861caf-kserve-self-signed-certs\"" Apr 17 17:44:08.916060 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:08.916034 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-dockercfg-75wtg\"" Apr 17 17:44:08.923538 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:08.923512 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7684bd49854bvf2"] Apr 17 17:44:08.957918 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:08.957898 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c4290424-af92-4f3f-b633-7ff5db8c3d58-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7684bd49854bvf2\" (UID: \"c4290424-af92-4f3f-b633-7ff5db8c3d58\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7684bd49854bvf2" Apr 17 17:44:08.958064 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:08.957931 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c4290424-af92-4f3f-b633-7ff5db8c3d58-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7684bd49854bvf2\" (UID: \"c4290424-af92-4f3f-b633-7ff5db8c3d58\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7684bd49854bvf2" Apr 17 17:44:08.958064 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:08.957962 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5h2cx\" (UniqueName: \"kubernetes.io/projected/c4290424-af92-4f3f-b633-7ff5db8c3d58-kube-api-access-5h2cx\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7684bd49854bvf2\" (UID: \"c4290424-af92-4f3f-b633-7ff5db8c3d58\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7684bd49854bvf2" Apr 17 17:44:08.958191 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:08.958057 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c4290424-af92-4f3f-b633-7ff5db8c3d58-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7684bd49854bvf2\" (UID: \"c4290424-af92-4f3f-b633-7ff5db8c3d58\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7684bd49854bvf2" Apr 17 17:44:08.958191 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:08.958105 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c4290424-af92-4f3f-b633-7ff5db8c3d58-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7684bd49854bvf2\" (UID: \"c4290424-af92-4f3f-b633-7ff5db8c3d58\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7684bd49854bvf2" Apr 17 17:44:08.958191 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:08.958169 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c4290424-af92-4f3f-b633-7ff5db8c3d58-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7684bd49854bvf2\" (UID: \"c4290424-af92-4f3f-b633-7ff5db8c3d58\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7684bd49854bvf2" Apr 17 17:44:08.989071 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:08.989045 2573 kubelet.go:2537] "SyncLoop ADD" 
source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54r46z2"] Apr 17 17:44:08.992386 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:08.992366 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54r46z2" Apr 17 17:44:09.014690 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:09.014666 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54r46z2"] Apr 17 17:44:09.058851 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:09.058823 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5h2cx\" (UniqueName: \"kubernetes.io/projected/c4290424-af92-4f3f-b633-7ff5db8c3d58-kube-api-access-5h2cx\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7684bd49854bvf2\" (UID: \"c4290424-af92-4f3f-b633-7ff5db8c3d58\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7684bd49854bvf2" Apr 17 17:44:09.058974 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:09.058861 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c4290424-af92-4f3f-b633-7ff5db8c3d58-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7684bd49854bvf2\" (UID: \"c4290424-af92-4f3f-b633-7ff5db8c3d58\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7684bd49854bvf2" Apr 17 17:44:09.058974 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:09.058886 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/487f6e70-3644-4837-b534-8f962a654798-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54r46z2\" (UID: \"487f6e70-3644-4837-b534-8f962a654798\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54r46z2" Apr 17 17:44:09.058974 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:09.058917 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/487f6e70-3644-4837-b534-8f962a654798-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54r46z2\" (UID: \"487f6e70-3644-4837-b534-8f962a654798\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54r46z2" Apr 17 17:44:09.058974 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:09.058943 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c4290424-af92-4f3f-b633-7ff5db8c3d58-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7684bd49854bvf2\" (UID: \"c4290424-af92-4f3f-b633-7ff5db8c3d58\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7684bd49854bvf2" Apr 17 17:44:09.058974 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:09.058969 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/487f6e70-3644-4837-b534-8f962a654798-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54r46z2\" (UID: \"487f6e70-3644-4837-b534-8f962a654798\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54r46z2" Apr 17 17:44:09.059275 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:09.058995 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zg2tt\" (UniqueName: \"kubernetes.io/projected/487f6e70-3644-4837-b534-8f962a654798-kube-api-access-zg2tt\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54r46z2\" (UID: 
\"487f6e70-3644-4837-b534-8f962a654798\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54r46z2" Apr 17 17:44:09.059275 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:09.059061 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c4290424-af92-4f3f-b633-7ff5db8c3d58-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7684bd49854bvf2\" (UID: \"c4290424-af92-4f3f-b633-7ff5db8c3d58\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7684bd49854bvf2" Apr 17 17:44:09.059275 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:09.059087 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/487f6e70-3644-4837-b534-8f962a654798-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54r46z2\" (UID: \"487f6e70-3644-4837-b534-8f962a654798\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54r46z2" Apr 17 17:44:09.059275 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:09.059125 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c4290424-af92-4f3f-b633-7ff5db8c3d58-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7684bd49854bvf2\" (UID: \"c4290424-af92-4f3f-b633-7ff5db8c3d58\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7684bd49854bvf2" Apr 17 17:44:09.059275 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:09.059154 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c4290424-af92-4f3f-b633-7ff5db8c3d58-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7684bd49854bvf2\" (UID: \"c4290424-af92-4f3f-b633-7ff5db8c3d58\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7684bd49854bvf2" Apr 17 17:44:09.059275 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:09.059185 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/487f6e70-3644-4837-b534-8f962a654798-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54r46z2\" (UID: \"487f6e70-3644-4837-b534-8f962a654798\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54r46z2" Apr 17 17:44:09.059494 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:09.059386 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c4290424-af92-4f3f-b633-7ff5db8c3d58-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7684bd49854bvf2\" (UID: \"c4290424-af92-4f3f-b633-7ff5db8c3d58\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7684bd49854bvf2" Apr 17 17:44:09.059494 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:09.059408 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c4290424-af92-4f3f-b633-7ff5db8c3d58-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7684bd49854bvf2\" (UID: \"c4290424-af92-4f3f-b633-7ff5db8c3d58\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7684bd49854bvf2" Apr 17 17:44:09.059494 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:09.059477 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c4290424-af92-4f3f-b633-7ff5db8c3d58-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7684bd49854bvf2\" (UID: \"c4290424-af92-4f3f-b633-7ff5db8c3d58\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7684bd49854bvf2" Apr 17 17:44:09.061059 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:09.061042 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c4290424-af92-4f3f-b633-7ff5db8c3d58-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7684bd49854bvf2\" (UID: \"c4290424-af92-4f3f-b633-7ff5db8c3d58\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7684bd49854bvf2" Apr 17 17:44:09.061391 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:09.061373 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c4290424-af92-4f3f-b633-7ff5db8c3d58-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7684bd49854bvf2\" (UID: \"c4290424-af92-4f3f-b633-7ff5db8c3d58\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7684bd49854bvf2" Apr 17 17:44:09.068526 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:09.068503 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5h2cx\" (UniqueName: \"kubernetes.io/projected/c4290424-af92-4f3f-b633-7ff5db8c3d58-kube-api-access-5h2cx\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7684bd49854bvf2\" (UID: \"c4290424-af92-4f3f-b633-7ff5db8c3d58\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7684bd49854bvf2" Apr 17 17:44:09.159612 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:09.159560 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/487f6e70-3644-4837-b534-8f962a654798-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54r46z2\" (UID: \"487f6e70-3644-4837-b534-8f962a654798\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54r46z2" Apr 17 17:44:09.159612 
ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:09.159608 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/487f6e70-3644-4837-b534-8f962a654798-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54r46z2\" (UID: \"487f6e70-3644-4837-b534-8f962a654798\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54r46z2" Apr 17 17:44:09.159733 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:09.159639 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/487f6e70-3644-4837-b534-8f962a654798-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54r46z2\" (UID: \"487f6e70-3644-4837-b534-8f962a654798\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54r46z2" Apr 17 17:44:09.159733 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:09.159676 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/487f6e70-3644-4837-b534-8f962a654798-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54r46z2\" (UID: \"487f6e70-3644-4837-b534-8f962a654798\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54r46z2" Apr 17 17:44:09.159733 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:09.159702 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zg2tt\" (UniqueName: \"kubernetes.io/projected/487f6e70-3644-4837-b534-8f962a654798-kube-api-access-zg2tt\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54r46z2\" (UID: \"487f6e70-3644-4837-b534-8f962a654798\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54r46z2" Apr 17 17:44:09.159877 ip-10-0-139-84 kubenswrapper[2573]: I0417 
17:44:09.159737 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/487f6e70-3644-4837-b534-8f962a654798-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54r46z2\" (UID: \"487f6e70-3644-4837-b534-8f962a654798\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54r46z2" Apr 17 17:44:09.160468 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:09.160407 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/487f6e70-3644-4837-b534-8f962a654798-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54r46z2\" (UID: \"487f6e70-3644-4837-b534-8f962a654798\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54r46z2" Apr 17 17:44:09.160468 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:09.160424 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/487f6e70-3644-4837-b534-8f962a654798-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54r46z2\" (UID: \"487f6e70-3644-4837-b534-8f962a654798\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54r46z2" Apr 17 17:44:09.160674 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:09.160649 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/487f6e70-3644-4837-b534-8f962a654798-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54r46z2\" (UID: \"487f6e70-3644-4837-b534-8f962a654798\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54r46z2" Apr 17 17:44:09.165687 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:09.165647 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" 
(UniqueName: \"kubernetes.io/empty-dir/487f6e70-3644-4837-b534-8f962a654798-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54r46z2\" (UID: \"487f6e70-3644-4837-b534-8f962a654798\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54r46z2" Apr 17 17:44:09.165798 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:09.165781 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/487f6e70-3644-4837-b534-8f962a654798-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54r46z2\" (UID: \"487f6e70-3644-4837-b534-8f962a654798\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54r46z2" Apr 17 17:44:09.169147 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:09.169071 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zg2tt\" (UniqueName: \"kubernetes.io/projected/487f6e70-3644-4837-b534-8f962a654798-kube-api-access-zg2tt\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54r46z2\" (UID: \"487f6e70-3644-4837-b534-8f962a654798\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54r46z2" Apr 17 17:44:09.221610 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:09.221584 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7684bd49854bvf2" Apr 17 17:44:09.302544 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:09.302511 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54r46z2" Apr 17 17:44:09.344184 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:09.344060 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7684bd49854bvf2"] Apr 17 17:44:09.348219 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:44:09.348075 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc4290424_af92_4f3f_b633_7ff5db8c3d58.slice/crio-de23127e38e87e0e5fa2e598b1b460aafcb12ec5abf88c3758a61720dd32d5c6 WatchSource:0}: Error finding container de23127e38e87e0e5fa2e598b1b460aafcb12ec5abf88c3758a61720dd32d5c6: Status 404 returned error can't find the container with id de23127e38e87e0e5fa2e598b1b460aafcb12ec5abf88c3758a61720dd32d5c6 Apr 17 17:44:09.427603 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:09.427581 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54r46z2"] Apr 17 17:44:09.429535 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:44:09.429512 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod487f6e70_3644_4837_b534_8f962a654798.slice/crio-7c21f333ad65318a34f1423702dd67133939eb4d655d3eaedcf193910655aeda WatchSource:0}: Error finding container 7c21f333ad65318a34f1423702dd67133939eb4d655d3eaedcf193910655aeda: Status 404 returned error can't find the container with id 7c21f333ad65318a34f1423702dd67133939eb4d655d3eaedcf193910655aeda Apr 17 17:44:09.997959 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:09.997913 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7684bd49854bvf2" 
event={"ID":"c4290424-af92-4f3f-b633-7ff5db8c3d58","Type":"ContainerStarted","Data":"de23127e38e87e0e5fa2e598b1b460aafcb12ec5abf88c3758a61720dd32d5c6"} Apr 17 17:44:09.999738 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:09.999708 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54r46z2" event={"ID":"487f6e70-3644-4837-b534-8f962a654798","Type":"ContainerStarted","Data":"897786768afb35c9d97cd2279f0b63eee4d5624ef6fcdd021e6b52b95649b9ec"} Apr 17 17:44:09.999866 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:09.999742 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54r46z2" event={"ID":"487f6e70-3644-4837-b534-8f962a654798","Type":"ContainerStarted","Data":"7c21f333ad65318a34f1423702dd67133939eb4d655d3eaedcf193910655aeda"} Apr 17 17:44:11.004911 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:11.004875 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7684bd49854bvf2" event={"ID":"c4290424-af92-4f3f-b633-7ff5db8c3d58","Type":"ContainerStarted","Data":"f16fa6a8d7815ba976da21eb815d3c6ad509fd68c0c0a4806370344dfaadec39"} Apr 17 17:44:11.005656 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:11.004987 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7684bd49854bvf2" Apr 17 17:44:12.010802 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:12.010760 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7684bd49854bvf2" event={"ID":"c4290424-af92-4f3f-b633-7ff5db8c3d58","Type":"ContainerStarted","Data":"a60bc395ccdf74c46d68aabc3b20b853491eb4fcffdc09a331fdef6581315c4a"} Apr 17 17:44:14.018808 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:14.018774 2573 
generic.go:358] "Generic (PLEG): container finished" podID="487f6e70-3644-4837-b534-8f962a654798" containerID="897786768afb35c9d97cd2279f0b63eee4d5624ef6fcdd021e6b52b95649b9ec" exitCode=0 Apr 17 17:44:14.019185 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:14.018848 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54r46z2" event={"ID":"487f6e70-3644-4837-b534-8f962a654798","Type":"ContainerDied","Data":"897786768afb35c9d97cd2279f0b63eee4d5624ef6fcdd021e6b52b95649b9ec"} Apr 17 17:44:15.024338 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:15.024302 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54r46z2" event={"ID":"487f6e70-3644-4837-b534-8f962a654798","Type":"ContainerStarted","Data":"563d85d0f2d9f29e831264f4de1457776a1616427eef4e52dedd48aad33049fe"} Apr 17 17:44:15.047039 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:15.046956 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54r46z2" podStartSLOduration=7.04693471 podStartE2EDuration="7.04693471s" podCreationTimestamp="2026-04-17 17:44:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:44:15.044800348 +0000 UTC m=+1200.360882224" watchObservedRunningTime="2026-04-17 17:44:15.04693471 +0000 UTC m=+1200.363016553" Apr 17 17:44:16.029814 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:16.029782 2573 generic.go:358] "Generic (PLEG): container finished" podID="c4290424-af92-4f3f-b633-7ff5db8c3d58" containerID="a60bc395ccdf74c46d68aabc3b20b853491eb4fcffdc09a331fdef6581315c4a" exitCode=0 Apr 17 17:44:16.030209 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:16.029847 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7684bd49854bvf2" event={"ID":"c4290424-af92-4f3f-b633-7ff5db8c3d58","Type":"ContainerDied","Data":"a60bc395ccdf74c46d68aabc3b20b853491eb4fcffdc09a331fdef6581315c4a"} Apr 17 17:44:17.039977 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:17.039945 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7684bd49854bvf2" event={"ID":"c4290424-af92-4f3f-b633-7ff5db8c3d58","Type":"ContainerStarted","Data":"dc2e4e23860706df8fe1b02a267913b7bda217fcca33a6f699d9785862a19eeb"} Apr 17 17:44:17.071606 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:17.071544 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7684bd49854bvf2" podStartSLOduration=8.224943988 podStartE2EDuration="9.071524418s" podCreationTimestamp="2026-04-17 17:44:08 +0000 UTC" firstStartedPulling="2026-04-17 17:44:09.352213281 +0000 UTC m=+1194.668295108" lastFinishedPulling="2026-04-17 17:44:10.198793718 +0000 UTC m=+1195.514875538" observedRunningTime="2026-04-17 17:44:17.070484834 +0000 UTC m=+1202.386566689" watchObservedRunningTime="2026-04-17 17:44:17.071524418 +0000 UTC m=+1202.387606262" Apr 17 17:44:18.792389 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:18.792352 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9499b6fdb-xvmcx"] Apr 17 17:44:18.797327 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:18.797305 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9499b6fdb-xvmcx" Apr 17 17:44:18.799982 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:18.799958 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv8de1d74aab16d9cabd8b5aafeb5248e8-kserve-self-signed-certs\"" Apr 17 17:44:18.807399 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:18.807376 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9499b6fdb-xvmcx"] Apr 17 17:44:18.847860 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:18.847826 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/772f98d2-3f0e-4fc5-a59b-75a7337b9299-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9499b6fdb-xvmcx\" (UID: \"772f98d2-3f0e-4fc5-a59b-75a7337b9299\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9499b6fdb-xvmcx" Apr 17 17:44:18.848029 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:18.847875 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/772f98d2-3f0e-4fc5-a59b-75a7337b9299-dshm\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9499b6fdb-xvmcx\" (UID: \"772f98d2-3f0e-4fc5-a59b-75a7337b9299\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9499b6fdb-xvmcx" Apr 17 17:44:18.848029 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:18.847901 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/772f98d2-3f0e-4fc5-a59b-75a7337b9299-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9499b6fdb-xvmcx\" (UID: \"772f98d2-3f0e-4fc5-a59b-75a7337b9299\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9499b6fdb-xvmcx" Apr 17 17:44:18.848029 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:18.847935 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/772f98d2-3f0e-4fc5-a59b-75a7337b9299-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9499b6fdb-xvmcx\" (UID: \"772f98d2-3f0e-4fc5-a59b-75a7337b9299\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9499b6fdb-xvmcx" Apr 17 17:44:18.848029 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:18.847955 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/772f98d2-3f0e-4fc5-a59b-75a7337b9299-home\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9499b6fdb-xvmcx\" (UID: \"772f98d2-3f0e-4fc5-a59b-75a7337b9299\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9499b6fdb-xvmcx" Apr 17 17:44:18.848029 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:18.847995 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vj5r\" (UniqueName: \"kubernetes.io/projected/772f98d2-3f0e-4fc5-a59b-75a7337b9299-kube-api-access-9vj5r\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9499b6fdb-xvmcx\" (UID: \"772f98d2-3f0e-4fc5-a59b-75a7337b9299\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9499b6fdb-xvmcx" Apr 17 17:44:18.949157 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:18.949119 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/772f98d2-3f0e-4fc5-a59b-75a7337b9299-dshm\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9499b6fdb-xvmcx\" (UID: \"772f98d2-3f0e-4fc5-a59b-75a7337b9299\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9499b6fdb-xvmcx" Apr 17 17:44:18.949157 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:18.949160 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/772f98d2-3f0e-4fc5-a59b-75a7337b9299-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9499b6fdb-xvmcx\" (UID: \"772f98d2-3f0e-4fc5-a59b-75a7337b9299\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9499b6fdb-xvmcx" Apr 17 17:44:18.949384 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:18.949344 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/772f98d2-3f0e-4fc5-a59b-75a7337b9299-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9499b6fdb-xvmcx\" (UID: \"772f98d2-3f0e-4fc5-a59b-75a7337b9299\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9499b6fdb-xvmcx" Apr 17 17:44:18.949443 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:18.949397 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/772f98d2-3f0e-4fc5-a59b-75a7337b9299-home\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9499b6fdb-xvmcx\" (UID: \"772f98d2-3f0e-4fc5-a59b-75a7337b9299\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9499b6fdb-xvmcx" Apr 17 17:44:18.949497 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:18.949455 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9vj5r\" (UniqueName: \"kubernetes.io/projected/772f98d2-3f0e-4fc5-a59b-75a7337b9299-kube-api-access-9vj5r\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9499b6fdb-xvmcx\" (UID: \"772f98d2-3f0e-4fc5-a59b-75a7337b9299\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9499b6fdb-xvmcx" Apr 17 17:44:18.949561 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:18.949536 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/772f98d2-3f0e-4fc5-a59b-75a7337b9299-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9499b6fdb-xvmcx\" (UID: \"772f98d2-3f0e-4fc5-a59b-75a7337b9299\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9499b6fdb-xvmcx" Apr 17 17:44:18.950048 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:18.949700 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/772f98d2-3f0e-4fc5-a59b-75a7337b9299-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9499b6fdb-xvmcx\" (UID: \"772f98d2-3f0e-4fc5-a59b-75a7337b9299\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9499b6fdb-xvmcx" Apr 17 17:44:18.950048 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:18.949780 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/772f98d2-3f0e-4fc5-a59b-75a7337b9299-home\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9499b6fdb-xvmcx\" (UID: \"772f98d2-3f0e-4fc5-a59b-75a7337b9299\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9499b6fdb-xvmcx" Apr 17 17:44:18.950213 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:18.950108 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/772f98d2-3f0e-4fc5-a59b-75a7337b9299-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9499b6fdb-xvmcx\" (UID: \"772f98d2-3f0e-4fc5-a59b-75a7337b9299\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9499b6fdb-xvmcx" Apr 17 
17:44:18.952083 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:18.952011 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/772f98d2-3f0e-4fc5-a59b-75a7337b9299-dshm\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9499b6fdb-xvmcx\" (UID: \"772f98d2-3f0e-4fc5-a59b-75a7337b9299\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9499b6fdb-xvmcx" Apr 17 17:44:18.952305 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:18.952282 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/772f98d2-3f0e-4fc5-a59b-75a7337b9299-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9499b6fdb-xvmcx\" (UID: \"772f98d2-3f0e-4fc5-a59b-75a7337b9299\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9499b6fdb-xvmcx" Apr 17 17:44:18.963826 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:18.963797 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vj5r\" (UniqueName: \"kubernetes.io/projected/772f98d2-3f0e-4fc5-a59b-75a7337b9299-kube-api-access-9vj5r\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9499b6fdb-xvmcx\" (UID: \"772f98d2-3f0e-4fc5-a59b-75a7337b9299\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9499b6fdb-xvmcx" Apr 17 17:44:19.110559 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:19.110456 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9499b6fdb-xvmcx" Apr 17 17:44:19.222036 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:19.221976 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7684bd49854bvf2" Apr 17 17:44:19.222207 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:19.222088 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7684bd49854bvf2" Apr 17 17:44:19.223164 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:19.223069 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7684bd49854bvf2" podUID="c4290424-af92-4f3f-b633-7ff5db8c3d58" containerName="main" probeResult="failure" output="Get \"https://10.133.0.29:8001/health\": dial tcp 10.133.0.29:8001: connect: connection refused" Apr 17 17:44:19.252280 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:19.251902 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9499b6fdb-xvmcx"] Apr 17 17:44:19.254489 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:44:19.254460 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod772f98d2_3f0e_4fc5_a59b_75a7337b9299.slice/crio-407d75b4d588e79d1e6151b9007edc8f0d8d4164a902e1f533f60dd23a4fbb0c WatchSource:0}: Error finding container 407d75b4d588e79d1e6151b9007edc8f0d8d4164a902e1f533f60dd23a4fbb0c: Status 404 returned error can't find the container with id 407d75b4d588e79d1e6151b9007edc8f0d8d4164a902e1f533f60dd23a4fbb0c Apr 17 17:44:19.303352 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:19.303324 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54r46z2" Apr 17 17:44:19.303352 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:19.303365 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54r46z2" Apr 17 17:44:19.305134 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:19.305106 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54r46z2" podUID="487f6e70-3644-4837-b534-8f962a654798" containerName="main" probeResult="failure" output="Get \"https://10.133.0.30:8000/health\": dial tcp 10.133.0.30:8000: connect: connection refused" Apr 17 17:44:19.609999 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:19.609974 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-test-kserve-5588885dd8-fx9rv_65c43d70-6989-455e-a571-8a88e61448a1/main/0.log" Apr 17 17:44:19.610481 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:19.610459 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5588885dd8-fx9rv" Apr 17 17:44:19.655970 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:19.655933 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/65c43d70-6989-455e-a571-8a88e61448a1-dshm\") pod \"65c43d70-6989-455e-a571-8a88e61448a1\" (UID: \"65c43d70-6989-455e-a571-8a88e61448a1\") " Apr 17 17:44:19.656149 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:19.655987 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/65c43d70-6989-455e-a571-8a88e61448a1-model-cache\") pod \"65c43d70-6989-455e-a571-8a88e61448a1\" (UID: \"65c43d70-6989-455e-a571-8a88e61448a1\") " Apr 17 17:44:19.656149 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:19.656075 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/65c43d70-6989-455e-a571-8a88e61448a1-kserve-provision-location\") pod \"65c43d70-6989-455e-a571-8a88e61448a1\" (UID: \"65c43d70-6989-455e-a571-8a88e61448a1\") " Apr 17 17:44:19.656149 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:19.656110 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/65c43d70-6989-455e-a571-8a88e61448a1-tls-certs\") pod \"65c43d70-6989-455e-a571-8a88e61448a1\" (UID: \"65c43d70-6989-455e-a571-8a88e61448a1\") " Apr 17 17:44:19.656149 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:19.656141 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tzm7j\" (UniqueName: \"kubernetes.io/projected/65c43d70-6989-455e-a571-8a88e61448a1-kube-api-access-tzm7j\") pod \"65c43d70-6989-455e-a571-8a88e61448a1\" (UID: \"65c43d70-6989-455e-a571-8a88e61448a1\") " Apr 17 17:44:19.656406 ip-10-0-139-84 
kubenswrapper[2573]: I0417 17:44:19.656190 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/65c43d70-6989-455e-a571-8a88e61448a1-home\") pod \"65c43d70-6989-455e-a571-8a88e61448a1\" (UID: \"65c43d70-6989-455e-a571-8a88e61448a1\") "
Apr 17 17:44:19.656807 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:19.656739 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65c43d70-6989-455e-a571-8a88e61448a1-home" (OuterVolumeSpecName: "home") pod "65c43d70-6989-455e-a571-8a88e61448a1" (UID: "65c43d70-6989-455e-a571-8a88e61448a1"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 17:44:19.659481 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:19.656978 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65c43d70-6989-455e-a571-8a88e61448a1-model-cache" (OuterVolumeSpecName: "model-cache") pod "65c43d70-6989-455e-a571-8a88e61448a1" (UID: "65c43d70-6989-455e-a571-8a88e61448a1"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 17:44:19.659481 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:19.659240 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65c43d70-6989-455e-a571-8a88e61448a1-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "65c43d70-6989-455e-a571-8a88e61448a1" (UID: "65c43d70-6989-455e-a571-8a88e61448a1"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 17:44:19.659848 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:19.659817 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65c43d70-6989-455e-a571-8a88e61448a1-kube-api-access-tzm7j" (OuterVolumeSpecName: "kube-api-access-tzm7j") pod "65c43d70-6989-455e-a571-8a88e61448a1" (UID: "65c43d70-6989-455e-a571-8a88e61448a1"). InnerVolumeSpecName "kube-api-access-tzm7j". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 17:44:19.659936 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:19.659901 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65c43d70-6989-455e-a571-8a88e61448a1-dshm" (OuterVolumeSpecName: "dshm") pod "65c43d70-6989-455e-a571-8a88e61448a1" (UID: "65c43d70-6989-455e-a571-8a88e61448a1"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 17:44:19.723065 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:19.722961 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65c43d70-6989-455e-a571-8a88e61448a1-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "65c43d70-6989-455e-a571-8a88e61448a1" (UID: "65c43d70-6989-455e-a571-8a88e61448a1"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 17:44:19.757550 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:19.757519 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/65c43d70-6989-455e-a571-8a88e61448a1-kserve-provision-location\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\""
Apr 17 17:44:19.757719 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:19.757572 2573 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/65c43d70-6989-455e-a571-8a88e61448a1-tls-certs\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\""
Apr 17 17:44:19.757719 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:19.757590 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tzm7j\" (UniqueName: \"kubernetes.io/projected/65c43d70-6989-455e-a571-8a88e61448a1-kube-api-access-tzm7j\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\""
Apr 17 17:44:19.757719 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:19.757600 2573 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/65c43d70-6989-455e-a571-8a88e61448a1-home\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\""
Apr 17 17:44:19.757719 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:19.757612 2573 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/65c43d70-6989-455e-a571-8a88e61448a1-dshm\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\""
Apr 17 17:44:19.757719 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:19.757623 2573 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/65c43d70-6989-455e-a571-8a88e61448a1-model-cache\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\""
Apr 17 17:44:20.051804 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:20.051718 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9499b6fdb-xvmcx" event={"ID":"772f98d2-3f0e-4fc5-a59b-75a7337b9299","Type":"ContainerStarted","Data":"5d236da04cf5d2d518a3971e676f2ce15629837276799ac0ac2d92fb9bf36580"}
Apr 17 17:44:20.051804 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:20.051765 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9499b6fdb-xvmcx" event={"ID":"772f98d2-3f0e-4fc5-a59b-75a7337b9299","Type":"ContainerStarted","Data":"407d75b4d588e79d1e6151b9007edc8f0d8d4164a902e1f533f60dd23a4fbb0c"}
Apr 17 17:44:20.053097 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:20.053076 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-test-kserve-5588885dd8-fx9rv_65c43d70-6989-455e-a571-8a88e61448a1/main/0.log"
Apr 17 17:44:20.053464 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:20.053431 2573 generic.go:358] "Generic (PLEG): container finished" podID="65c43d70-6989-455e-a571-8a88e61448a1" containerID="c7944e4758f3a2fbb46ea128317c0905f25f6ca0267692f78a4b33a73fe88095" exitCode=137
Apr 17 17:44:20.053547 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:20.053474 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5588885dd8-fx9rv" event={"ID":"65c43d70-6989-455e-a571-8a88e61448a1","Type":"ContainerDied","Data":"c7944e4758f3a2fbb46ea128317c0905f25f6ca0267692f78a4b33a73fe88095"}
Apr 17 17:44:20.053547 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:20.053496 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5588885dd8-fx9rv" event={"ID":"65c43d70-6989-455e-a571-8a88e61448a1","Type":"ContainerDied","Data":"2f881d9f0046ed7621bdd452f8f5982297546d92b6ad18545f043767a456bb8a"}
Apr 17 17:44:20.053547 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:20.053508 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5588885dd8-fx9rv"
Apr 17 17:44:20.053547 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:20.053514 2573 scope.go:117] "RemoveContainer" containerID="c7944e4758f3a2fbb46ea128317c0905f25f6ca0267692f78a4b33a73fe88095"
Apr 17 17:44:20.075460 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:20.075416 2573 scope.go:117] "RemoveContainer" containerID="f67d16ff561cd3b466997c62cc3c35feabb539aeab4df4f920f3f5df93e85a1f"
Apr 17 17:44:20.091781 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:20.091756 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-5588885dd8-fx9rv"]
Apr 17 17:44:20.101458 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:20.101429 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-5588885dd8-fx9rv"]
Apr 17 17:44:20.149343 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:20.149057 2573 scope.go:117] "RemoveContainer" containerID="c7944e4758f3a2fbb46ea128317c0905f25f6ca0267692f78a4b33a73fe88095"
Apr 17 17:44:20.149738 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:44:20.149705 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7944e4758f3a2fbb46ea128317c0905f25f6ca0267692f78a4b33a73fe88095\": container with ID starting with c7944e4758f3a2fbb46ea128317c0905f25f6ca0267692f78a4b33a73fe88095 not found: ID does not exist" containerID="c7944e4758f3a2fbb46ea128317c0905f25f6ca0267692f78a4b33a73fe88095"
Apr 17 17:44:20.149863 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:20.149749 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7944e4758f3a2fbb46ea128317c0905f25f6ca0267692f78a4b33a73fe88095"} err="failed to get container status \"c7944e4758f3a2fbb46ea128317c0905f25f6ca0267692f78a4b33a73fe88095\": rpc error: code = NotFound desc = could not find container \"c7944e4758f3a2fbb46ea128317c0905f25f6ca0267692f78a4b33a73fe88095\": container with ID starting with c7944e4758f3a2fbb46ea128317c0905f25f6ca0267692f78a4b33a73fe88095 not found: ID does not exist"
Apr 17 17:44:20.149863 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:20.149779 2573 scope.go:117] "RemoveContainer" containerID="f67d16ff561cd3b466997c62cc3c35feabb539aeab4df4f920f3f5df93e85a1f"
Apr 17 17:44:20.150313 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:44:20.150282 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f67d16ff561cd3b466997c62cc3c35feabb539aeab4df4f920f3f5df93e85a1f\": container with ID starting with f67d16ff561cd3b466997c62cc3c35feabb539aeab4df4f920f3f5df93e85a1f not found: ID does not exist" containerID="f67d16ff561cd3b466997c62cc3c35feabb539aeab4df4f920f3f5df93e85a1f"
Apr 17 17:44:20.150411 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:20.150321 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f67d16ff561cd3b466997c62cc3c35feabb539aeab4df4f920f3f5df93e85a1f"} err="failed to get container status \"f67d16ff561cd3b466997c62cc3c35feabb539aeab4df4f920f3f5df93e85a1f\": rpc error: code = NotFound desc = could not find container \"f67d16ff561cd3b466997c62cc3c35feabb539aeab4df4f920f3f5df93e85a1f\": container with ID starting with f67d16ff561cd3b466997c62cc3c35feabb539aeab4df4f920f3f5df93e85a1f not found: ID does not exist"
Apr 17 17:44:21.197359 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:21.197321 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65c43d70-6989-455e-a571-8a88e61448a1" path="/var/lib/kubelet/pods/65c43d70-6989-455e-a571-8a88e61448a1/volumes"
Apr 17 17:44:24.079641 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:24.079604 2573 generic.go:358] "Generic (PLEG): container finished" podID="772f98d2-3f0e-4fc5-a59b-75a7337b9299" containerID="5d236da04cf5d2d518a3971e676f2ce15629837276799ac0ac2d92fb9bf36580" exitCode=0
Apr 17 17:44:24.080273 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:24.079646 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9499b6fdb-xvmcx" event={"ID":"772f98d2-3f0e-4fc5-a59b-75a7337b9299","Type":"ContainerDied","Data":"5d236da04cf5d2d518a3971e676f2ce15629837276799ac0ac2d92fb9bf36580"}
Apr 17 17:44:25.085799 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:25.085762 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9499b6fdb-xvmcx" event={"ID":"772f98d2-3f0e-4fc5-a59b-75a7337b9299","Type":"ContainerStarted","Data":"f67aa4ceee8465aba7e4f43fd388d8e47051779ccbf96543dae2a9f98cea9171"}
Apr 17 17:44:25.112381 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:25.112338 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9499b6fdb-xvmcx" podStartSLOduration=7.112323097 podStartE2EDuration="7.112323097s" podCreationTimestamp="2026-04-17 17:44:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:44:25.108948206 +0000 UTC m=+1210.425030048" watchObservedRunningTime="2026-04-17 17:44:25.112323097 +0000 UTC m=+1210.428404945"
Apr 17 17:44:29.111435 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:29.111396 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9499b6fdb-xvmcx"
Apr 17 17:44:29.111818 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:29.111448 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9499b6fdb-xvmcx"
Apr 17 17:44:29.113392 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:29.113360 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9499b6fdb-xvmcx" podUID="772f98d2-3f0e-4fc5-a59b-75a7337b9299" containerName="main" probeResult="failure" output="Get \"https://10.133.0.31:8000/health\": dial tcp 10.133.0.31:8000: connect: connection refused"
Apr 17 17:44:29.222669 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:29.222626 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7684bd49854bvf2" podUID="c4290424-af92-4f3f-b633-7ff5db8c3d58" containerName="main" probeResult="failure" output="Get \"https://10.133.0.29:8001/health\": dial tcp 10.133.0.29:8001: connect: connection refused"
Apr 17 17:44:29.240505 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:29.240470 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7684bd49854bvf2"
Apr 17 17:44:29.303492 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:29.303443 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54r46z2" podUID="487f6e70-3644-4837-b534-8f962a654798" containerName="main" probeResult="failure" output="Get \"https://10.133.0.30:8000/health\": dial tcp 10.133.0.30:8000: connect: connection refused"
Apr 17 17:44:39.111316 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:39.111264 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9499b6fdb-xvmcx" podUID="772f98d2-3f0e-4fc5-a59b-75a7337b9299" containerName="main" probeResult="failure" output="Get \"https://10.133.0.31:8000/health\": dial tcp 10.133.0.31:8000: connect: connection refused"
Apr 17 17:44:39.222067 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:39.222005 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7684bd49854bvf2" podUID="c4290424-af92-4f3f-b633-7ff5db8c3d58" containerName="main" probeResult="failure" output="Get \"https://10.133.0.29:8001/health\": dial tcp 10.133.0.29:8001: connect: connection refused"
Apr 17 17:44:39.303514 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:39.303469 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54r46z2" podUID="487f6e70-3644-4837-b534-8f962a654798" containerName="main" probeResult="failure" output="Get \"https://10.133.0.30:8000/health\": dial tcp 10.133.0.30:8000: connect: connection refused"
Apr 17 17:44:49.111165 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:49.111122 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9499b6fdb-xvmcx" podUID="772f98d2-3f0e-4fc5-a59b-75a7337b9299" containerName="main" probeResult="failure" output="Get \"https://10.133.0.31:8000/health\": dial tcp 10.133.0.31:8000: connect: connection refused"
Apr 17 17:44:49.222061 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:49.222003 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7684bd49854bvf2" podUID="c4290424-af92-4f3f-b633-7ff5db8c3d58" containerName="main" probeResult="failure" output="Get \"https://10.133.0.29:8001/health\": dial tcp 10.133.0.29:8001: connect: connection refused"
Apr 17 17:44:49.303573 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:49.303509 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54r46z2" podUID="487f6e70-3644-4837-b534-8f962a654798" containerName="main" probeResult="failure" output="Get \"https://10.133.0.30:8000/health\": dial tcp 10.133.0.30:8000: connect: connection refused"
Apr 17 17:44:59.111461 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:59.111410 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9499b6fdb-xvmcx" podUID="772f98d2-3f0e-4fc5-a59b-75a7337b9299" containerName="main" probeResult="failure" output="Get \"https://10.133.0.31:8000/health\": dial tcp 10.133.0.31:8000: connect: connection refused"
Apr 17 17:44:59.222739 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:59.222691 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7684bd49854bvf2" podUID="c4290424-af92-4f3f-b633-7ff5db8c3d58" containerName="main" probeResult="failure" output="Get \"https://10.133.0.29:8001/health\": dial tcp 10.133.0.29:8001: connect: connection refused"
Apr 17 17:44:59.303600 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:44:59.303554 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54r46z2" podUID="487f6e70-3644-4837-b534-8f962a654798" containerName="main" probeResult="failure" output="Get \"https://10.133.0.30:8000/health\": dial tcp 10.133.0.30:8000: connect: connection refused"
Apr 17 17:45:09.111698 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:45:09.111578 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9499b6fdb-xvmcx" podUID="772f98d2-3f0e-4fc5-a59b-75a7337b9299" containerName="main" probeResult="failure" output="Get \"https://10.133.0.31:8000/health\": dial tcp 10.133.0.31:8000: connect: connection refused"
Apr 17 17:45:09.222441 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:45:09.222386 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7684bd49854bvf2" podUID="c4290424-af92-4f3f-b633-7ff5db8c3d58" containerName="main" probeResult="failure" output="Get \"https://10.133.0.29:8001/health\": dial tcp 10.133.0.29:8001: connect: connection refused"
Apr 17 17:45:09.304006 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:45:09.303955 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54r46z2" podUID="487f6e70-3644-4837-b534-8f962a654798" containerName="main" probeResult="failure" output="Get \"https://10.133.0.30:8000/health\": dial tcp 10.133.0.30:8000: connect: connection refused"
Apr 17 17:45:19.111384 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:45:19.111342 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9499b6fdb-xvmcx" podUID="772f98d2-3f0e-4fc5-a59b-75a7337b9299" containerName="main" probeResult="failure" output="Get \"https://10.133.0.31:8000/health\": dial tcp 10.133.0.31:8000: connect: connection refused"
Apr 17 17:45:19.222522 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:45:19.222451 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7684bd49854bvf2" podUID="c4290424-af92-4f3f-b633-7ff5db8c3d58" containerName="main" probeResult="failure" output="Get \"https://10.133.0.29:8001/health\": dial tcp 10.133.0.29:8001: connect: connection refused"
Apr 17 17:45:19.303896 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:45:19.303858 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54r46z2" podUID="487f6e70-3644-4837-b534-8f962a654798" containerName="main" probeResult="failure" output="Get \"https://10.133.0.30:8000/health\": dial tcp 10.133.0.30:8000: connect: connection refused"
Apr 17 17:45:29.111396 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:45:29.111351 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9499b6fdb-xvmcx" podUID="772f98d2-3f0e-4fc5-a59b-75a7337b9299" containerName="main" probeResult="failure" output="Get \"https://10.133.0.31:8000/health\": dial tcp 10.133.0.31:8000: connect: connection refused"
Apr 17 17:45:29.222239 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:45:29.222197 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7684bd49854bvf2" podUID="c4290424-af92-4f3f-b633-7ff5db8c3d58" containerName="main" probeResult="failure" output="Get \"https://10.133.0.29:8001/health\": dial tcp 10.133.0.29:8001: connect: connection refused"
Apr 17 17:45:29.302873 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:45:29.302840 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54r46z2" podUID="487f6e70-3644-4837-b534-8f962a654798" containerName="main" probeResult="failure" output="Get \"https://10.133.0.30:8000/health\": dial tcp 10.133.0.30:8000: connect: connection refused"
Apr 17 17:45:39.111612 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:45:39.111560 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9499b6fdb-xvmcx" podUID="772f98d2-3f0e-4fc5-a59b-75a7337b9299" containerName="main" probeResult="failure" output="Get \"https://10.133.0.31:8000/health\": dial tcp 10.133.0.31:8000: connect: connection refused"
Apr 17 17:45:39.222376 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:45:39.222339 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7684bd49854bvf2" podUID="c4290424-af92-4f3f-b633-7ff5db8c3d58" containerName="main" probeResult="failure" output="Get \"https://10.133.0.29:8001/health\": dial tcp 10.133.0.29:8001: connect: connection refused"
Apr 17 17:45:39.303123 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:45:39.303075 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54r46z2" podUID="487f6e70-3644-4837-b534-8f962a654798" containerName="main" probeResult="failure" output="Get \"https://10.133.0.30:8000/health\": dial tcp 10.133.0.30:8000: connect: connection refused"
Apr 17 17:45:49.111075 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:45:49.111028 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9499b6fdb-xvmcx" podUID="772f98d2-3f0e-4fc5-a59b-75a7337b9299" containerName="main" probeResult="failure" output="Get \"https://10.133.0.31:8000/health\": dial tcp 10.133.0.31:8000: connect: connection refused"
Apr 17 17:45:49.222750 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:45:49.222705 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7684bd49854bvf2" podUID="c4290424-af92-4f3f-b633-7ff5db8c3d58" containerName="main" probeResult="failure" output="Get \"https://10.133.0.29:8001/health\": dial tcp 10.133.0.29:8001: connect: connection refused"
Apr 17 17:45:49.303924 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:45:49.303886 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54r46z2" podUID="487f6e70-3644-4837-b534-8f962a654798" containerName="main" probeResult="failure" output="Get \"https://10.133.0.30:8000/health\": dial tcp 10.133.0.30:8000: connect: connection refused"
Apr 17 17:45:59.111228 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:45:59.111188 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9499b6fdb-xvmcx" podUID="772f98d2-3f0e-4fc5-a59b-75a7337b9299" containerName="main" probeResult="failure" output="Get \"https://10.133.0.31:8000/health\": dial tcp 10.133.0.31:8000: connect: connection refused"
Apr 17 17:45:59.222740 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:45:59.222700 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7684bd49854bvf2" podUID="c4290424-af92-4f3f-b633-7ff5db8c3d58" containerName="main" probeResult="failure" output="Get \"https://10.133.0.29:8001/health\": dial tcp 10.133.0.29:8001: connect: connection refused"
Apr 17 17:45:59.303386 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:45:59.303341 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54r46z2" podUID="487f6e70-3644-4837-b534-8f962a654798" containerName="main" probeResult="failure" output="Get \"https://10.133.0.30:8000/health\": dial tcp 10.133.0.30:8000: connect: connection refused"
Apr 17 17:46:09.111730 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:46:09.111683 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9499b6fdb-xvmcx" podUID="772f98d2-3f0e-4fc5-a59b-75a7337b9299" containerName="main" probeResult="failure" output="Get \"https://10.133.0.31:8000/health\": dial tcp 10.133.0.31:8000: connect: connection refused"
Apr 17 17:46:09.222708 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:46:09.222666 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7684bd49854bvf2" podUID="c4290424-af92-4f3f-b633-7ff5db8c3d58" containerName="main" probeResult="failure" output="Get \"https://10.133.0.29:8001/health\": dial tcp 10.133.0.29:8001: connect: connection refused"
Apr 17 17:46:09.303944 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:46:09.303904 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54r46z2" podUID="487f6e70-3644-4837-b534-8f962a654798" containerName="main" probeResult="failure" output="Get \"https://10.133.0.30:8000/health\": dial tcp 10.133.0.30:8000: connect: connection refused"
Apr 17 17:46:19.111493 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:46:19.111446 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9499b6fdb-xvmcx" podUID="772f98d2-3f0e-4fc5-a59b-75a7337b9299" containerName="main" probeResult="failure" output="Get \"https://10.133.0.31:8000/health\": dial tcp 10.133.0.31:8000: connect: connection refused"
Apr 17 17:46:19.221968 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:46:19.221922 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7684bd49854bvf2" podUID="c4290424-af92-4f3f-b633-7ff5db8c3d58" containerName="main" probeResult="failure" output="Get \"https://10.133.0.29:8001/health\": dial tcp 10.133.0.29:8001: connect: connection refused"
Apr 17 17:46:19.303144 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:46:19.303099 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54r46z2" podUID="487f6e70-3644-4837-b534-8f962a654798" containerName="main" probeResult="failure" output="Get \"https://10.133.0.30:8000/health\": dial tcp 10.133.0.30:8000: connect: connection refused"
Apr 17 17:46:29.111776 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:46:29.111733 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9499b6fdb-xvmcx" podUID="772f98d2-3f0e-4fc5-a59b-75a7337b9299" containerName="main" probeResult="failure" output="Get \"https://10.133.0.31:8000/health\": dial tcp 10.133.0.31:8000: connect: connection refused"
Apr 17 17:46:29.222242 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:46:29.222203 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7684bd49854bvf2" podUID="c4290424-af92-4f3f-b633-7ff5db8c3d58" containerName="main" probeResult="failure" output="Get \"https://10.133.0.29:8001/health\": dial tcp 10.133.0.29:8001: connect: connection refused"
Apr 17 17:46:29.303768 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:46:29.303731 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54r46z2" podUID="487f6e70-3644-4837-b534-8f962a654798" containerName="main" probeResult="failure" output="Get \"https://10.133.0.30:8000/health\": dial tcp 10.133.0.30:8000: connect: connection refused"
Apr 17 17:46:39.111286 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:46:39.111199 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9499b6fdb-xvmcx" podUID="772f98d2-3f0e-4fc5-a59b-75a7337b9299" containerName="main" probeResult="failure" output="Get \"https://10.133.0.31:8000/health\": dial tcp 10.133.0.31:8000: connect: connection refused"
Apr 17 17:46:39.222426 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:46:39.222388 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7684bd49854bvf2" podUID="c4290424-af92-4f3f-b633-7ff5db8c3d58" containerName="main" probeResult="failure" output="Get \"https://10.133.0.29:8001/health\": dial tcp 10.133.0.29:8001: connect: connection refused"
Apr 17 17:46:39.303545 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:46:39.303507 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54r46z2" podUID="487f6e70-3644-4837-b534-8f962a654798" containerName="main" probeResult="failure" output="Get \"https://10.133.0.30:8000/health\": dial tcp 10.133.0.30:8000: connect: connection refused"
Apr 17 17:46:49.110893 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:46:49.110852 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9499b6fdb-xvmcx" podUID="772f98d2-3f0e-4fc5-a59b-75a7337b9299" containerName="main" probeResult="failure" output="Get \"https://10.133.0.31:8000/health\": dial tcp 10.133.0.31:8000: connect: connection refused"
Apr 17 17:46:49.222132 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:46:49.222081 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7684bd49854bvf2" podUID="c4290424-af92-4f3f-b633-7ff5db8c3d58" containerName="main" probeResult="failure" output="Get \"https://10.133.0.29:8001/health\": dial tcp 10.133.0.29:8001: connect: connection refused"
Apr 17 17:46:49.303295 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:46:49.303257 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54r46z2" podUID="487f6e70-3644-4837-b534-8f962a654798" containerName="main" probeResult="failure" output="Get \"https://10.133.0.30:8000/health\": dial tcp 10.133.0.30:8000: connect: connection refused"
Apr 17 17:46:59.111301 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:46:59.111254 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9499b6fdb-xvmcx" podUID="772f98d2-3f0e-4fc5-a59b-75a7337b9299" containerName="main" probeResult="failure" output="Get \"https://10.133.0.31:8000/health\": dial tcp 10.133.0.31:8000: connect: connection refused"
Apr 17 17:46:59.222252 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:46:59.222216 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7684bd49854bvf2" podUID="c4290424-af92-4f3f-b633-7ff5db8c3d58" containerName="main" probeResult="failure" output="Get \"https://10.133.0.29:8001/health\": dial tcp 10.133.0.29:8001: connect: connection refused"
Apr 17 17:46:59.313467 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:46:59.313435 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54r46z2"
Apr 17 17:46:59.321141 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:46:59.321114 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54r46z2"
Apr 17 17:47:09.120585 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:09.120551 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9499b6fdb-xvmcx"
Apr 17 17:47:09.128319 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:09.128288 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9499b6fdb-xvmcx"
Apr 17 17:47:09.232451 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:09.232423 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7684bd49854bvf2"
Apr 17 17:47:09.244979 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:09.244955 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7684bd49854bvf2"
Apr 17 17:47:14.845833 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:14.845796 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9499b6fdb-xvmcx"]
Apr 17 17:47:14.846207 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:14.846094 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9499b6fdb-xvmcx" podUID="772f98d2-3f0e-4fc5-a59b-75a7337b9299" containerName="main" containerID="cri-o://f67aa4ceee8465aba7e4f43fd388d8e47051779ccbf96543dae2a9f98cea9171" gracePeriod=30
Apr 17 17:47:19.324674 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:19.324640 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"]
Apr 17 17:47:19.325045 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:19.324953 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="65c43d70-6989-455e-a571-8a88e61448a1" containerName="main"
Apr 17 17:47:19.325045 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:19.324964 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="65c43d70-6989-455e-a571-8a88e61448a1" containerName="main"
Apr 17 17:47:19.325045 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:19.324973 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="65c43d70-6989-455e-a571-8a88e61448a1" containerName="storage-initializer"
Apr 17 17:47:19.325045 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:19.324979 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="65c43d70-6989-455e-a571-8a88e61448a1" containerName="storage-initializer"
Apr 17 17:47:19.325179 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:19.325066 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="65c43d70-6989-455e-a571-8a88e61448a1" containerName="main"
Apr 17 17:47:19.329735 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:19.329712 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 17 17:47:19.332936 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:19.332917 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-f312f5-cb7fb8cf-dockercfg-ck8l2\""
Apr 17 17:47:19.333061 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:19.332965 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs\""
Apr 17 17:47:19.337888 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:19.337864 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"]
Apr 17 17:47:19.429521 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:19.429493 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c019f0cd-4f9d-4493-9b5a-1ce1b0d9ffaf-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"c019f0cd-4f9d-4493-9b5a-1ce1b0d9ffaf\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 17 17:47:19.429715 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:19.429538 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c019f0cd-4f9d-4493-9b5a-1ce1b0d9ffaf-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"c019f0cd-4f9d-4493-9b5a-1ce1b0d9ffaf\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 17 17:47:19.429715 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:19.429615 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c019f0cd-4f9d-4493-9b5a-1ce1b0d9ffaf-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"c019f0cd-4f9d-4493-9b5a-1ce1b0d9ffaf\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 17 17:47:19.429715 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:19.429654 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c019f0cd-4f9d-4493-9b5a-1ce1b0d9ffaf-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"c019f0cd-4f9d-4493-9b5a-1ce1b0d9ffaf\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 17 17:47:19.429715 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:19.429699 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c019f0cd-4f9d-4493-9b5a-1ce1b0d9ffaf-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"c019f0cd-4f9d-4493-9b5a-1ce1b0d9ffaf\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 17 17:47:19.429932 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:19.429718 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88gvt\" (UniqueName: \"kubernetes.io/projected/c019f0cd-4f9d-4493-9b5a-1ce1b0d9ffaf-kube-api-access-88gvt\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"c019f0cd-4f9d-4493-9b5a-1ce1b0d9ffaf\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 17 17:47:19.530948 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:19.530910 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c019f0cd-4f9d-4493-9b5a-1ce1b0d9ffaf-tls-certs\") pod
\"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"c019f0cd-4f9d-4493-9b5a-1ce1b0d9ffaf\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 17 17:47:19.531134 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:19.530956 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c019f0cd-4f9d-4493-9b5a-1ce1b0d9ffaf-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"c019f0cd-4f9d-4493-9b5a-1ce1b0d9ffaf\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 17 17:47:19.531134 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:19.530992 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c019f0cd-4f9d-4493-9b5a-1ce1b0d9ffaf-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"c019f0cd-4f9d-4493-9b5a-1ce1b0d9ffaf\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 17 17:47:19.531134 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:19.531037 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c019f0cd-4f9d-4493-9b5a-1ce1b0d9ffaf-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"c019f0cd-4f9d-4493-9b5a-1ce1b0d9ffaf\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 17 17:47:19.531134 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:19.531084 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c019f0cd-4f9d-4493-9b5a-1ce1b0d9ffaf-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"c019f0cd-4f9d-4493-9b5a-1ce1b0d9ffaf\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 17 17:47:19.531369 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:19.531216 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-88gvt\" (UniqueName: \"kubernetes.io/projected/c019f0cd-4f9d-4493-9b5a-1ce1b0d9ffaf-kube-api-access-88gvt\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"c019f0cd-4f9d-4493-9b5a-1ce1b0d9ffaf\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 17 17:47:19.531422 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:19.531400 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c019f0cd-4f9d-4493-9b5a-1ce1b0d9ffaf-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"c019f0cd-4f9d-4493-9b5a-1ce1b0d9ffaf\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 17 17:47:19.531501 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:19.531476 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c019f0cd-4f9d-4493-9b5a-1ce1b0d9ffaf-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"c019f0cd-4f9d-4493-9b5a-1ce1b0d9ffaf\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 17 17:47:19.531501 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:19.531494 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c019f0cd-4f9d-4493-9b5a-1ce1b0d9ffaf-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"c019f0cd-4f9d-4493-9b5a-1ce1b0d9ffaf\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 17 17:47:19.533304 ip-10-0-139-84 kubenswrapper[2573]: I0417 
17:47:19.533283 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c019f0cd-4f9d-4493-9b5a-1ce1b0d9ffaf-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"c019f0cd-4f9d-4493-9b5a-1ce1b0d9ffaf\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 17 17:47:19.533475 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:19.533458 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c019f0cd-4f9d-4493-9b5a-1ce1b0d9ffaf-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"c019f0cd-4f9d-4493-9b5a-1ce1b0d9ffaf\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 17 17:47:19.541285 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:19.541257 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-88gvt\" (UniqueName: \"kubernetes.io/projected/c019f0cd-4f9d-4493-9b5a-1ce1b0d9ffaf-kube-api-access-88gvt\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"c019f0cd-4f9d-4493-9b5a-1ce1b0d9ffaf\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 17 17:47:19.641630 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:19.641584 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 17 17:47:19.798630 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:19.798603 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"] Apr 17 17:47:19.800831 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:47:19.800798 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc019f0cd_4f9d_4493_9b5a_1ce1b0d9ffaf.slice/crio-e91e35233c02345427c5e22acfe58452fb5a9ac094c57023317a5a0d8a357ea3 WatchSource:0}: Error finding container e91e35233c02345427c5e22acfe58452fb5a9ac094c57023317a5a0d8a357ea3: Status 404 returned error can't find the container with id e91e35233c02345427c5e22acfe58452fb5a9ac094c57023317a5a0d8a357ea3 Apr 17 17:47:19.802939 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:19.802919 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 17:47:20.678775 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:20.678689 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"c019f0cd-4f9d-4493-9b5a-1ce1b0d9ffaf","Type":"ContainerStarted","Data":"2c4a7e917d3fd80d0358efa21175ee7f6b9cc49a4a63d09b677171a221907ce8"} Apr 17 17:47:20.678775 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:20.678739 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"c019f0cd-4f9d-4493-9b5a-1ce1b0d9ffaf","Type":"ContainerStarted","Data":"e91e35233c02345427c5e22acfe58452fb5a9ac094c57023317a5a0d8a357ea3"} Apr 17 17:47:24.695434 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:24.695401 2573 generic.go:358] "Generic (PLEG): container finished" podID="c019f0cd-4f9d-4493-9b5a-1ce1b0d9ffaf" 
containerID="2c4a7e917d3fd80d0358efa21175ee7f6b9cc49a4a63d09b677171a221907ce8" exitCode=0 Apr 17 17:47:24.695853 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:24.695480 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"c019f0cd-4f9d-4493-9b5a-1ce1b0d9ffaf","Type":"ContainerDied","Data":"2c4a7e917d3fd80d0358efa21175ee7f6b9cc49a4a63d09b677171a221907ce8"} Apr 17 17:47:25.700866 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:25.700829 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"c019f0cd-4f9d-4493-9b5a-1ce1b0d9ffaf","Type":"ContainerStarted","Data":"28df246bc4b95776fd1ab37f0f5fd943d6cf0d15f50c3e10aa136368dcba0d44"} Apr 17 17:47:25.722328 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:25.722269 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podStartSLOduration=6.722248716 podStartE2EDuration="6.722248716s" podCreationTimestamp="2026-04-17 17:47:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:47:25.720606222 +0000 UTC m=+1391.036688065" watchObservedRunningTime="2026-04-17 17:47:25.722248716 +0000 UTC m=+1391.038330555" Apr 17 17:47:29.642448 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:29.642418 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 17 17:47:29.643866 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:29.643838 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="c019f0cd-4f9d-4493-9b5a-1ce1b0d9ffaf" containerName="main" probeResult="failure" output="Get 
\"https://10.133.0.32:8000/health\": dial tcp 10.133.0.32:8000: connect: connection refused" Apr 17 17:47:30.038614 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:30.038524 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54r46z2"] Apr 17 17:47:30.039352 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:30.039290 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54r46z2" podUID="487f6e70-3644-4837-b534-8f962a654798" containerName="main" containerID="cri-o://563d85d0f2d9f29e831264f4de1457776a1616427eef4e52dedd48aad33049fe" gracePeriod=30 Apr 17 17:47:30.046048 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:30.046006 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7684bd49854bvf2"] Apr 17 17:47:30.046794 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:30.046700 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7684bd49854bvf2" podUID="c4290424-af92-4f3f-b633-7ff5db8c3d58" containerName="main" containerID="cri-o://dc2e4e23860706df8fe1b02a267913b7bda217fcca33a6f699d9785862a19eeb" gracePeriod=30 Apr 17 17:47:34.405120 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:34.405065 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b5496cb4d-lv2cp"] Apr 17 17:47:34.439241 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:34.439211 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b5496cb4d-lv2cp"] Apr 17 17:47:34.439438 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:34.439414 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b5496cb4d-lv2cp" Apr 17 17:47:34.439668 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:34.439421 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-75777d65d9-k7tj7"] Apr 17 17:47:34.442136 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:34.442113 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-kserve-dockercfg-bdl4z\"" Apr 17 17:47:34.442255 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:34.442135 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-kserve-self-signed-certs\"" Apr 17 17:47:34.454403 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:34.454329 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-75777d65d9-k7tj7"] Apr 17 17:47:34.454497 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:34.454436 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-75777d65d9-k7tj7" Apr 17 17:47:34.466762 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:34.466737 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2c6c05e1-9ed7-47cf-b3d9-fb52c0dcbaf8-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-6b5496cb4d-lv2cp\" (UID: \"2c6c05e1-9ed7-47cf-b3d9-fb52c0dcbaf8\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b5496cb4d-lv2cp" Apr 17 17:47:34.466848 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:34.466819 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2c6c05e1-9ed7-47cf-b3d9-fb52c0dcbaf8-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-6b5496cb4d-lv2cp\" (UID: \"2c6c05e1-9ed7-47cf-b3d9-fb52c0dcbaf8\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b5496cb4d-lv2cp" Apr 17 17:47:34.466896 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:34.466845 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7dkk\" (UniqueName: \"kubernetes.io/projected/2c6c05e1-9ed7-47cf-b3d9-fb52c0dcbaf8-kube-api-access-d7dkk\") pod \"custom-route-timeout-pd-test-kserve-6b5496cb4d-lv2cp\" (UID: \"2c6c05e1-9ed7-47cf-b3d9-fb52c0dcbaf8\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b5496cb4d-lv2cp" Apr 17 17:47:34.466936 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:34.466893 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/2c6c05e1-9ed7-47cf-b3d9-fb52c0dcbaf8-model-cache\") pod \"custom-route-timeout-pd-test-kserve-6b5496cb4d-lv2cp\" (UID: \"2c6c05e1-9ed7-47cf-b3d9-fb52c0dcbaf8\") " 
pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b5496cb4d-lv2cp" Apr 17 17:47:34.466936 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:34.466921 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2c6c05e1-9ed7-47cf-b3d9-fb52c0dcbaf8-dshm\") pod \"custom-route-timeout-pd-test-kserve-6b5496cb4d-lv2cp\" (UID: \"2c6c05e1-9ed7-47cf-b3d9-fb52c0dcbaf8\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b5496cb4d-lv2cp" Apr 17 17:47:34.467066 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:34.466967 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/2c6c05e1-9ed7-47cf-b3d9-fb52c0dcbaf8-home\") pod \"custom-route-timeout-pd-test-kserve-6b5496cb4d-lv2cp\" (UID: \"2c6c05e1-9ed7-47cf-b3d9-fb52c0dcbaf8\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b5496cb4d-lv2cp" Apr 17 17:47:34.568215 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:34.568186 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2c6c05e1-9ed7-47cf-b3d9-fb52c0dcbaf8-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-6b5496cb4d-lv2cp\" (UID: \"2c6c05e1-9ed7-47cf-b3d9-fb52c0dcbaf8\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b5496cb4d-lv2cp" Apr 17 17:47:34.568382 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:34.568227 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/030ac2a5-cc7d-49c9-ae10-69cc1c17fd8b-home\") pod \"custom-route-timeout-pd-test-kserve-prefill-75777d65d9-k7tj7\" (UID: \"030ac2a5-cc7d-49c9-ae10-69cc1c17fd8b\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-75777d65d9-k7tj7" Apr 17 17:47:34.568382 
ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:34.568247 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/030ac2a5-cc7d-49c9-ae10-69cc1c17fd8b-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-prefill-75777d65d9-k7tj7\" (UID: \"030ac2a5-cc7d-49c9-ae10-69cc1c17fd8b\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-75777d65d9-k7tj7" Apr 17 17:47:34.568382 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:34.568267 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2c6c05e1-9ed7-47cf-b3d9-fb52c0dcbaf8-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-6b5496cb4d-lv2cp\" (UID: \"2c6c05e1-9ed7-47cf-b3d9-fb52c0dcbaf8\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b5496cb4d-lv2cp" Apr 17 17:47:34.568382 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:34.568287 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d7dkk\" (UniqueName: \"kubernetes.io/projected/2c6c05e1-9ed7-47cf-b3d9-fb52c0dcbaf8-kube-api-access-d7dkk\") pod \"custom-route-timeout-pd-test-kserve-6b5496cb4d-lv2cp\" (UID: \"2c6c05e1-9ed7-47cf-b3d9-fb52c0dcbaf8\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b5496cb4d-lv2cp" Apr 17 17:47:34.568382 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:34.568332 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/030ac2a5-cc7d-49c9-ae10-69cc1c17fd8b-model-cache\") pod \"custom-route-timeout-pd-test-kserve-prefill-75777d65d9-k7tj7\" (UID: \"030ac2a5-cc7d-49c9-ae10-69cc1c17fd8b\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-75777d65d9-k7tj7" Apr 17 17:47:34.568382 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:34.568375 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/030ac2a5-cc7d-49c9-ae10-69cc1c17fd8b-dshm\") pod \"custom-route-timeout-pd-test-kserve-prefill-75777d65d9-k7tj7\" (UID: \"030ac2a5-cc7d-49c9-ae10-69cc1c17fd8b\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-75777d65d9-k7tj7" Apr 17 17:47:34.568712 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:34.568396 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/2c6c05e1-9ed7-47cf-b3d9-fb52c0dcbaf8-model-cache\") pod \"custom-route-timeout-pd-test-kserve-6b5496cb4d-lv2cp\" (UID: \"2c6c05e1-9ed7-47cf-b3d9-fb52c0dcbaf8\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b5496cb4d-lv2cp" Apr 17 17:47:34.568712 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:34.568432 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2c6c05e1-9ed7-47cf-b3d9-fb52c0dcbaf8-dshm\") pod \"custom-route-timeout-pd-test-kserve-6b5496cb4d-lv2cp\" (UID: \"2c6c05e1-9ed7-47cf-b3d9-fb52c0dcbaf8\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b5496cb4d-lv2cp" Apr 17 17:47:34.568712 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:34.568467 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/030ac2a5-cc7d-49c9-ae10-69cc1c17fd8b-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-prefill-75777d65d9-k7tj7\" (UID: \"030ac2a5-cc7d-49c9-ae10-69cc1c17fd8b\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-75777d65d9-k7tj7" Apr 17 17:47:34.568712 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:34.568504 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: 
\"kubernetes.io/empty-dir/2c6c05e1-9ed7-47cf-b3d9-fb52c0dcbaf8-home\") pod \"custom-route-timeout-pd-test-kserve-6b5496cb4d-lv2cp\" (UID: \"2c6c05e1-9ed7-47cf-b3d9-fb52c0dcbaf8\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b5496cb4d-lv2cp" Apr 17 17:47:34.568712 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:34.568541 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxbvg\" (UniqueName: \"kubernetes.io/projected/030ac2a5-cc7d-49c9-ae10-69cc1c17fd8b-kube-api-access-nxbvg\") pod \"custom-route-timeout-pd-test-kserve-prefill-75777d65d9-k7tj7\" (UID: \"030ac2a5-cc7d-49c9-ae10-69cc1c17fd8b\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-75777d65d9-k7tj7" Apr 17 17:47:34.568712 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:34.568630 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2c6c05e1-9ed7-47cf-b3d9-fb52c0dcbaf8-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-6b5496cb4d-lv2cp\" (UID: \"2c6c05e1-9ed7-47cf-b3d9-fb52c0dcbaf8\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b5496cb4d-lv2cp" Apr 17 17:47:34.569006 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:34.568805 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/2c6c05e1-9ed7-47cf-b3d9-fb52c0dcbaf8-model-cache\") pod \"custom-route-timeout-pd-test-kserve-6b5496cb4d-lv2cp\" (UID: \"2c6c05e1-9ed7-47cf-b3d9-fb52c0dcbaf8\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b5496cb4d-lv2cp" Apr 17 17:47:34.569006 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:34.568888 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/2c6c05e1-9ed7-47cf-b3d9-fb52c0dcbaf8-home\") pod 
\"custom-route-timeout-pd-test-kserve-6b5496cb4d-lv2cp\" (UID: \"2c6c05e1-9ed7-47cf-b3d9-fb52c0dcbaf8\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b5496cb4d-lv2cp" Apr 17 17:47:34.570626 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:34.570596 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2c6c05e1-9ed7-47cf-b3d9-fb52c0dcbaf8-dshm\") pod \"custom-route-timeout-pd-test-kserve-6b5496cb4d-lv2cp\" (UID: \"2c6c05e1-9ed7-47cf-b3d9-fb52c0dcbaf8\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b5496cb4d-lv2cp" Apr 17 17:47:34.570839 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:34.570822 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2c6c05e1-9ed7-47cf-b3d9-fb52c0dcbaf8-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-6b5496cb4d-lv2cp\" (UID: \"2c6c05e1-9ed7-47cf-b3d9-fb52c0dcbaf8\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b5496cb4d-lv2cp" Apr 17 17:47:34.576910 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:34.576882 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7dkk\" (UniqueName: \"kubernetes.io/projected/2c6c05e1-9ed7-47cf-b3d9-fb52c0dcbaf8-kube-api-access-d7dkk\") pod \"custom-route-timeout-pd-test-kserve-6b5496cb4d-lv2cp\" (UID: \"2c6c05e1-9ed7-47cf-b3d9-fb52c0dcbaf8\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b5496cb4d-lv2cp" Apr 17 17:47:34.669113 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:34.669029 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/030ac2a5-cc7d-49c9-ae10-69cc1c17fd8b-home\") pod \"custom-route-timeout-pd-test-kserve-prefill-75777d65d9-k7tj7\" (UID: \"030ac2a5-cc7d-49c9-ae10-69cc1c17fd8b\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-75777d65d9-k7tj7" Apr 
17 17:47:34.669113 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:34.669083 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/030ac2a5-cc7d-49c9-ae10-69cc1c17fd8b-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-prefill-75777d65d9-k7tj7\" (UID: \"030ac2a5-cc7d-49c9-ae10-69cc1c17fd8b\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-75777d65d9-k7tj7" Apr 17 17:47:34.669312 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:34.669121 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/030ac2a5-cc7d-49c9-ae10-69cc1c17fd8b-model-cache\") pod \"custom-route-timeout-pd-test-kserve-prefill-75777d65d9-k7tj7\" (UID: \"030ac2a5-cc7d-49c9-ae10-69cc1c17fd8b\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-75777d65d9-k7tj7" Apr 17 17:47:34.669312 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:34.669149 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/030ac2a5-cc7d-49c9-ae10-69cc1c17fd8b-dshm\") pod \"custom-route-timeout-pd-test-kserve-prefill-75777d65d9-k7tj7\" (UID: \"030ac2a5-cc7d-49c9-ae10-69cc1c17fd8b\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-75777d65d9-k7tj7" Apr 17 17:47:34.669312 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:34.669187 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/030ac2a5-cc7d-49c9-ae10-69cc1c17fd8b-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-prefill-75777d65d9-k7tj7\" (UID: \"030ac2a5-cc7d-49c9-ae10-69cc1c17fd8b\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-75777d65d9-k7tj7" Apr 17 17:47:34.669312 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:34.669218 2573 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nxbvg\" (UniqueName: \"kubernetes.io/projected/030ac2a5-cc7d-49c9-ae10-69cc1c17fd8b-kube-api-access-nxbvg\") pod \"custom-route-timeout-pd-test-kserve-prefill-75777d65d9-k7tj7\" (UID: \"030ac2a5-cc7d-49c9-ae10-69cc1c17fd8b\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-75777d65d9-k7tj7" Apr 17 17:47:34.669511 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:34.669463 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/030ac2a5-cc7d-49c9-ae10-69cc1c17fd8b-home\") pod \"custom-route-timeout-pd-test-kserve-prefill-75777d65d9-k7tj7\" (UID: \"030ac2a5-cc7d-49c9-ae10-69cc1c17fd8b\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-75777d65d9-k7tj7" Apr 17 17:47:34.669803 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:34.669777 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/030ac2a5-cc7d-49c9-ae10-69cc1c17fd8b-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-prefill-75777d65d9-k7tj7\" (UID: \"030ac2a5-cc7d-49c9-ae10-69cc1c17fd8b\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-75777d65d9-k7tj7" Apr 17 17:47:34.669803 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:34.669792 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/030ac2a5-cc7d-49c9-ae10-69cc1c17fd8b-model-cache\") pod \"custom-route-timeout-pd-test-kserve-prefill-75777d65d9-k7tj7\" (UID: \"030ac2a5-cc7d-49c9-ae10-69cc1c17fd8b\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-75777d65d9-k7tj7" Apr 17 17:47:34.671456 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:34.671430 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: 
\"kubernetes.io/empty-dir/030ac2a5-cc7d-49c9-ae10-69cc1c17fd8b-dshm\") pod \"custom-route-timeout-pd-test-kserve-prefill-75777d65d9-k7tj7\" (UID: \"030ac2a5-cc7d-49c9-ae10-69cc1c17fd8b\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-75777d65d9-k7tj7" Apr 17 17:47:34.671600 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:34.671585 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/030ac2a5-cc7d-49c9-ae10-69cc1c17fd8b-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-prefill-75777d65d9-k7tj7\" (UID: \"030ac2a5-cc7d-49c9-ae10-69cc1c17fd8b\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-75777d65d9-k7tj7" Apr 17 17:47:34.684316 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:34.684281 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxbvg\" (UniqueName: \"kubernetes.io/projected/030ac2a5-cc7d-49c9-ae10-69cc1c17fd8b-kube-api-access-nxbvg\") pod \"custom-route-timeout-pd-test-kserve-prefill-75777d65d9-k7tj7\" (UID: \"030ac2a5-cc7d-49c9-ae10-69cc1c17fd8b\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-75777d65d9-k7tj7" Apr 17 17:47:34.751629 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:34.751602 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b5496cb4d-lv2cp" Apr 17 17:47:34.765495 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:34.765466 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-75777d65d9-k7tj7" Apr 17 17:47:34.906771 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:34.906691 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b5496cb4d-lv2cp"] Apr 17 17:47:34.909997 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:47:34.909963 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c6c05e1_9ed7_47cf_b3d9_fb52c0dcbaf8.slice/crio-a1eea0a9d2ceab0d26db66e85071e82b6e528027406b4b15c148e3cf4bd05d60 WatchSource:0}: Error finding container a1eea0a9d2ceab0d26db66e85071e82b6e528027406b4b15c148e3cf4bd05d60: Status 404 returned error can't find the container with id a1eea0a9d2ceab0d26db66e85071e82b6e528027406b4b15c148e3cf4bd05d60 Apr 17 17:47:34.929275 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:34.929157 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-75777d65d9-k7tj7"] Apr 17 17:47:34.931202 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:47:34.931177 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod030ac2a5_cc7d_49c9_ae10_69cc1c17fd8b.slice/crio-2ca64a0ea80620d279565126c1345fefc34fddac1107e0f1a8fbc87e0b71136a WatchSource:0}: Error finding container 2ca64a0ea80620d279565126c1345fefc34fddac1107e0f1a8fbc87e0b71136a: Status 404 returned error can't find the container with id 2ca64a0ea80620d279565126c1345fefc34fddac1107e0f1a8fbc87e0b71136a Apr 17 17:47:35.734830 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:35.734790 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b5496cb4d-lv2cp" 
event={"ID":"2c6c05e1-9ed7-47cf-b3d9-fb52c0dcbaf8","Type":"ContainerStarted","Data":"d14cacd181dbcc6b375bb208538dd52a91c064f52401e8d28803007394277c25"} Apr 17 17:47:35.735297 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:35.735048 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b5496cb4d-lv2cp" event={"ID":"2c6c05e1-9ed7-47cf-b3d9-fb52c0dcbaf8","Type":"ContainerStarted","Data":"a1eea0a9d2ceab0d26db66e85071e82b6e528027406b4b15c148e3cf4bd05d60"} Apr 17 17:47:35.735297 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:35.735085 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b5496cb4d-lv2cp" Apr 17 17:47:35.736482 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:35.736452 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-75777d65d9-k7tj7" event={"ID":"030ac2a5-cc7d-49c9-ae10-69cc1c17fd8b","Type":"ContainerStarted","Data":"f0e37f006073840b9726ff688eddaa600f4d08ef5d121741eb0f4206a5ecf489"} Apr 17 17:47:35.736623 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:35.736488 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-75777d65d9-k7tj7" event={"ID":"030ac2a5-cc7d-49c9-ae10-69cc1c17fd8b","Type":"ContainerStarted","Data":"2ca64a0ea80620d279565126c1345fefc34fddac1107e0f1a8fbc87e0b71136a"} Apr 17 17:47:36.742189 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:36.742144 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b5496cb4d-lv2cp" event={"ID":"2c6c05e1-9ed7-47cf-b3d9-fb52c0dcbaf8","Type":"ContainerStarted","Data":"cdba6578612acddac698fe562aa54ee96bd6dcac3e49e73ef32896b354953fe8"} Apr 17 17:47:39.642895 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:39.642847 2573 prober.go:120] "Probe failed" 
probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="c019f0cd-4f9d-4493-9b5a-1ce1b0d9ffaf" containerName="main" probeResult="failure" output="Get \"https://10.133.0.32:8000/health\": dial tcp 10.133.0.32:8000: connect: connection refused" Apr 17 17:47:39.754841 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:39.754803 2573 generic.go:358] "Generic (PLEG): container finished" podID="030ac2a5-cc7d-49c9-ae10-69cc1c17fd8b" containerID="f0e37f006073840b9726ff688eddaa600f4d08ef5d121741eb0f4206a5ecf489" exitCode=0 Apr 17 17:47:39.755081 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:39.754857 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-75777d65d9-k7tj7" event={"ID":"030ac2a5-cc7d-49c9-ae10-69cc1c17fd8b","Type":"ContainerDied","Data":"f0e37f006073840b9726ff688eddaa600f4d08ef5d121741eb0f4206a5ecf489"} Apr 17 17:47:40.760656 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:40.760621 2573 generic.go:358] "Generic (PLEG): container finished" podID="2c6c05e1-9ed7-47cf-b3d9-fb52c0dcbaf8" containerID="cdba6578612acddac698fe562aa54ee96bd6dcac3e49e73ef32896b354953fe8" exitCode=0 Apr 17 17:47:40.761085 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:40.760699 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b5496cb4d-lv2cp" event={"ID":"2c6c05e1-9ed7-47cf-b3d9-fb52c0dcbaf8","Type":"ContainerDied","Data":"cdba6578612acddac698fe562aa54ee96bd6dcac3e49e73ef32896b354953fe8"} Apr 17 17:47:40.762375 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:40.762341 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-75777d65d9-k7tj7" event={"ID":"030ac2a5-cc7d-49c9-ae10-69cc1c17fd8b","Type":"ContainerStarted","Data":"0799c2815589d644f0c31c013280acd81c9e3853d1cd2abf3bd3ce16030ddf2b"} Apr 17 17:47:40.807384 ip-10-0-139-84 
kubenswrapper[2573]: I0417 17:47:40.807243 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-75777d65d9-k7tj7" podStartSLOduration=6.807226056 podStartE2EDuration="6.807226056s" podCreationTimestamp="2026-04-17 17:47:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:47:40.806297997 +0000 UTC m=+1406.122379849" watchObservedRunningTime="2026-04-17 17:47:40.807226056 +0000 UTC m=+1406.123307898" Apr 17 17:47:41.768121 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:41.768073 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b5496cb4d-lv2cp" event={"ID":"2c6c05e1-9ed7-47cf-b3d9-fb52c0dcbaf8","Type":"ContainerStarted","Data":"1ff3d399e35d3e87bf26c7d15e5059abe6f0ea7f4c300768ca11e27a6580323c"} Apr 17 17:47:41.794700 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:41.794648 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b5496cb4d-lv2cp" podStartSLOduration=7.794630117 podStartE2EDuration="7.794630117s" podCreationTimestamp="2026-04-17 17:47:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:47:41.79208106 +0000 UTC m=+1407.108162905" watchObservedRunningTime="2026-04-17 17:47:41.794630117 +0000 UTC m=+1407.110711958" Apr 17 17:47:44.752081 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:44.752038 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b5496cb4d-lv2cp" Apr 17 17:47:44.752081 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:44.752074 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b5496cb4d-lv2cp" Apr 17 17:47:44.753512 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:44.753476 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b5496cb4d-lv2cp" podUID="2c6c05e1-9ed7-47cf-b3d9-fb52c0dcbaf8" containerName="main" probeResult="failure" output="Get \"https://10.133.0.33:8001/health\": dial tcp 10.133.0.33:8001: connect: connection refused" Apr 17 17:47:44.766074 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:44.766039 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-75777d65d9-k7tj7" Apr 17 17:47:44.766228 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:44.766202 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-75777d65d9-k7tj7" Apr 17 17:47:44.767305 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:44.767271 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-75777d65d9-k7tj7" podUID="030ac2a5-cc7d-49c9-ae10-69cc1c17fd8b" containerName="main" probeResult="failure" output="Get \"https://10.133.0.34:8000/health\": dial tcp 10.133.0.34:8000: connect: connection refused" Apr 17 17:47:45.152740 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:45.152710 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9499b6fdb-xvmcx_772f98d2-3f0e-4fc5-a59b-75a7337b9299/main/0.log" Apr 17 17:47:45.153180 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:45.153158 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9499b6fdb-xvmcx" Apr 17 17:47:45.279032 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:45.278993 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/772f98d2-3f0e-4fc5-a59b-75a7337b9299-dshm\") pod \"772f98d2-3f0e-4fc5-a59b-75a7337b9299\" (UID: \"772f98d2-3f0e-4fc5-a59b-75a7337b9299\") " Apr 17 17:47:45.279215 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:45.279057 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9vj5r\" (UniqueName: \"kubernetes.io/projected/772f98d2-3f0e-4fc5-a59b-75a7337b9299-kube-api-access-9vj5r\") pod \"772f98d2-3f0e-4fc5-a59b-75a7337b9299\" (UID: \"772f98d2-3f0e-4fc5-a59b-75a7337b9299\") " Apr 17 17:47:45.279215 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:45.279104 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/772f98d2-3f0e-4fc5-a59b-75a7337b9299-home\") pod \"772f98d2-3f0e-4fc5-a59b-75a7337b9299\" (UID: \"772f98d2-3f0e-4fc5-a59b-75a7337b9299\") " Apr 17 17:47:45.279215 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:45.279142 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/772f98d2-3f0e-4fc5-a59b-75a7337b9299-model-cache\") pod \"772f98d2-3f0e-4fc5-a59b-75a7337b9299\" (UID: \"772f98d2-3f0e-4fc5-a59b-75a7337b9299\") " Apr 17 17:47:45.279215 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:45.279179 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/772f98d2-3f0e-4fc5-a59b-75a7337b9299-kserve-provision-location\") pod \"772f98d2-3f0e-4fc5-a59b-75a7337b9299\" (UID: \"772f98d2-3f0e-4fc5-a59b-75a7337b9299\") " Apr 17 17:47:45.279215 
ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:45.279206 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/772f98d2-3f0e-4fc5-a59b-75a7337b9299-tls-certs\") pod \"772f98d2-3f0e-4fc5-a59b-75a7337b9299\" (UID: \"772f98d2-3f0e-4fc5-a59b-75a7337b9299\") " Apr 17 17:47:45.279780 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:45.279707 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/772f98d2-3f0e-4fc5-a59b-75a7337b9299-model-cache" (OuterVolumeSpecName: "model-cache") pod "772f98d2-3f0e-4fc5-a59b-75a7337b9299" (UID: "772f98d2-3f0e-4fc5-a59b-75a7337b9299"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:47:45.280207 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:45.280144 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/772f98d2-3f0e-4fc5-a59b-75a7337b9299-home" (OuterVolumeSpecName: "home") pod "772f98d2-3f0e-4fc5-a59b-75a7337b9299" (UID: "772f98d2-3f0e-4fc5-a59b-75a7337b9299"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:47:45.282307 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:45.282269 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/772f98d2-3f0e-4fc5-a59b-75a7337b9299-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "772f98d2-3f0e-4fc5-a59b-75a7337b9299" (UID: "772f98d2-3f0e-4fc5-a59b-75a7337b9299"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:47:45.282515 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:45.282486 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/772f98d2-3f0e-4fc5-a59b-75a7337b9299-kube-api-access-9vj5r" (OuterVolumeSpecName: "kube-api-access-9vj5r") pod "772f98d2-3f0e-4fc5-a59b-75a7337b9299" (UID: "772f98d2-3f0e-4fc5-a59b-75a7337b9299"). InnerVolumeSpecName "kube-api-access-9vj5r". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:47:45.283281 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:45.283241 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/772f98d2-3f0e-4fc5-a59b-75a7337b9299-dshm" (OuterVolumeSpecName: "dshm") pod "772f98d2-3f0e-4fc5-a59b-75a7337b9299" (UID: "772f98d2-3f0e-4fc5-a59b-75a7337b9299"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:47:45.360295 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:45.360199 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/772f98d2-3f0e-4fc5-a59b-75a7337b9299-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "772f98d2-3f0e-4fc5-a59b-75a7337b9299" (UID: "772f98d2-3f0e-4fc5-a59b-75a7337b9299"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:47:45.380527 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:45.380482 2573 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/772f98d2-3f0e-4fc5-a59b-75a7337b9299-home\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\"" Apr 17 17:47:45.380527 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:45.380527 2573 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/772f98d2-3f0e-4fc5-a59b-75a7337b9299-model-cache\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\"" Apr 17 17:47:45.380749 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:45.380546 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/772f98d2-3f0e-4fc5-a59b-75a7337b9299-kserve-provision-location\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\"" Apr 17 17:47:45.380749 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:45.380562 2573 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/772f98d2-3f0e-4fc5-a59b-75a7337b9299-tls-certs\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\"" Apr 17 17:47:45.380749 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:45.380579 2573 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/772f98d2-3f0e-4fc5-a59b-75a7337b9299-dshm\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\"" Apr 17 17:47:45.380749 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:45.380593 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9vj5r\" (UniqueName: \"kubernetes.io/projected/772f98d2-3f0e-4fc5-a59b-75a7337b9299-kube-api-access-9vj5r\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\"" Apr 17 17:47:45.785326 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:45.785292 2573 log.go:25] "Finished 
parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9499b6fdb-xvmcx_772f98d2-3f0e-4fc5-a59b-75a7337b9299/main/0.log" Apr 17 17:47:45.785865 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:45.785659 2573 generic.go:358] "Generic (PLEG): container finished" podID="772f98d2-3f0e-4fc5-a59b-75a7337b9299" containerID="f67aa4ceee8465aba7e4f43fd388d8e47051779ccbf96543dae2a9f98cea9171" exitCode=137 Apr 17 17:47:45.785865 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:45.785706 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9499b6fdb-xvmcx" event={"ID":"772f98d2-3f0e-4fc5-a59b-75a7337b9299","Type":"ContainerDied","Data":"f67aa4ceee8465aba7e4f43fd388d8e47051779ccbf96543dae2a9f98cea9171"} Apr 17 17:47:45.785865 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:45.785735 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9499b6fdb-xvmcx" event={"ID":"772f98d2-3f0e-4fc5-a59b-75a7337b9299","Type":"ContainerDied","Data":"407d75b4d588e79d1e6151b9007edc8f0d8d4164a902e1f533f60dd23a4fbb0c"} Apr 17 17:47:45.785865 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:45.785737 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9499b6fdb-xvmcx" Apr 17 17:47:45.785865 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:45.785816 2573 scope.go:117] "RemoveContainer" containerID="f67aa4ceee8465aba7e4f43fd388d8e47051779ccbf96543dae2a9f98cea9171" Apr 17 17:47:45.808824 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:45.808794 2573 scope.go:117] "RemoveContainer" containerID="5d236da04cf5d2d518a3971e676f2ce15629837276799ac0ac2d92fb9bf36580" Apr 17 17:47:45.809338 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:45.809189 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9499b6fdb-xvmcx"] Apr 17 17:47:45.813542 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:45.813512 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9499b6fdb-xvmcx"] Apr 17 17:47:45.877640 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:45.877484 2573 scope.go:117] "RemoveContainer" containerID="f67aa4ceee8465aba7e4f43fd388d8e47051779ccbf96543dae2a9f98cea9171" Apr 17 17:47:45.877912 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:47:45.877891 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f67aa4ceee8465aba7e4f43fd388d8e47051779ccbf96543dae2a9f98cea9171\": container with ID starting with f67aa4ceee8465aba7e4f43fd388d8e47051779ccbf96543dae2a9f98cea9171 not found: ID does not exist" containerID="f67aa4ceee8465aba7e4f43fd388d8e47051779ccbf96543dae2a9f98cea9171" Apr 17 17:47:45.878062 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:45.877924 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f67aa4ceee8465aba7e4f43fd388d8e47051779ccbf96543dae2a9f98cea9171"} err="failed to get container status \"f67aa4ceee8465aba7e4f43fd388d8e47051779ccbf96543dae2a9f98cea9171\": rpc 
error: code = NotFound desc = could not find container \"f67aa4ceee8465aba7e4f43fd388d8e47051779ccbf96543dae2a9f98cea9171\": container with ID starting with f67aa4ceee8465aba7e4f43fd388d8e47051779ccbf96543dae2a9f98cea9171 not found: ID does not exist" Apr 17 17:47:45.878062 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:45.877952 2573 scope.go:117] "RemoveContainer" containerID="5d236da04cf5d2d518a3971e676f2ce15629837276799ac0ac2d92fb9bf36580" Apr 17 17:47:45.878292 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:47:45.878273 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d236da04cf5d2d518a3971e676f2ce15629837276799ac0ac2d92fb9bf36580\": container with ID starting with 5d236da04cf5d2d518a3971e676f2ce15629837276799ac0ac2d92fb9bf36580 not found: ID does not exist" containerID="5d236da04cf5d2d518a3971e676f2ce15629837276799ac0ac2d92fb9bf36580" Apr 17 17:47:45.878355 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:45.878309 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d236da04cf5d2d518a3971e676f2ce15629837276799ac0ac2d92fb9bf36580"} err="failed to get container status \"5d236da04cf5d2d518a3971e676f2ce15629837276799ac0ac2d92fb9bf36580\": rpc error: code = NotFound desc = could not find container \"5d236da04cf5d2d518a3971e676f2ce15629837276799ac0ac2d92fb9bf36580\": container with ID starting with 5d236da04cf5d2d518a3971e676f2ce15629837276799ac0ac2d92fb9bf36580 not found: ID does not exist" Apr 17 17:47:47.198776 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:47.198738 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="772f98d2-3f0e-4fc5-a59b-75a7337b9299" path="/var/lib/kubelet/pods/772f98d2-3f0e-4fc5-a59b-75a7337b9299/volumes" Apr 17 17:47:49.642483 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:49.642452 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 17 17:47:49.642945 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:49.642753 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="c019f0cd-4f9d-4493-9b5a-1ce1b0d9ffaf" containerName="main" probeResult="failure" output="Get \"https://10.133.0.32:8000/health\": dial tcp 10.133.0.32:8000: connect: connection refused" Apr 17 17:47:54.752533 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:54.752006 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b5496cb4d-lv2cp" podUID="2c6c05e1-9ed7-47cf-b3d9-fb52c0dcbaf8" containerName="main" probeResult="failure" output="Get \"https://10.133.0.33:8001/health\": dial tcp 10.133.0.33:8001: connect: connection refused" Apr 17 17:47:54.765896 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:54.765844 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-75777d65d9-k7tj7" podUID="030ac2a5-cc7d-49c9-ae10-69cc1c17fd8b" containerName="main" probeResult="failure" output="Get \"https://10.133.0.34:8000/health\": dial tcp 10.133.0.34:8000: connect: connection refused" Apr 17 17:47:54.766416 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:54.766391 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b5496cb4d-lv2cp" Apr 17 17:47:59.642580 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:47:59.642541 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="c019f0cd-4f9d-4493-9b5a-1ce1b0d9ffaf" containerName="main" probeResult="failure" output="Get \"https://10.133.0.32:8000/health\": dial tcp 10.133.0.32:8000: connect: connection refused" Apr 17 17:48:00.047420 ip-10-0-139-84 
kubenswrapper[2573]: I0417 17:48:00.047318 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7684bd49854bvf2" podUID="c4290424-af92-4f3f-b633-7ff5db8c3d58" containerName="llm-d-routing-sidecar" containerID="cri-o://f16fa6a8d7815ba976da21eb815d3c6ad509fd68c0c0a4806370344dfaadec39" gracePeriod=2 Apr 17 17:48:00.393644 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:48:00.393612 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7684bd49854bvf2_c4290424-af92-4f3f-b633-7ff5db8c3d58/main/0.log" Apr 17 17:48:00.394488 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:48:00.394390 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7684bd49854bvf2" Apr 17 17:48:00.408657 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:48:00.408631 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54r46z2" Apr 17 17:48:00.523653 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:48:00.523622 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c4290424-af92-4f3f-b633-7ff5db8c3d58-model-cache\") pod \"c4290424-af92-4f3f-b633-7ff5db8c3d58\" (UID: \"c4290424-af92-4f3f-b633-7ff5db8c3d58\") " Apr 17 17:48:00.523847 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:48:00.523682 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c4290424-af92-4f3f-b633-7ff5db8c3d58-tls-certs\") pod \"c4290424-af92-4f3f-b633-7ff5db8c3d58\" (UID: \"c4290424-af92-4f3f-b633-7ff5db8c3d58\") " Apr 17 17:48:00.523847 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:48:00.523718 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zg2tt\" (UniqueName: \"kubernetes.io/projected/487f6e70-3644-4837-b534-8f962a654798-kube-api-access-zg2tt\") pod \"487f6e70-3644-4837-b534-8f962a654798\" (UID: \"487f6e70-3644-4837-b534-8f962a654798\") " Apr 17 17:48:00.523847 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:48:00.523751 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/487f6e70-3644-4837-b534-8f962a654798-home\") pod \"487f6e70-3644-4837-b534-8f962a654798\" (UID: \"487f6e70-3644-4837-b534-8f962a654798\") " Apr 17 17:48:00.524110 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:48:00.523981 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c4290424-af92-4f3f-b633-7ff5db8c3d58-model-cache" (OuterVolumeSpecName: "model-cache") pod "c4290424-af92-4f3f-b633-7ff5db8c3d58" (UID: "c4290424-af92-4f3f-b633-7ff5db8c3d58"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:48:00.524110 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:48:00.524076 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/487f6e70-3644-4837-b534-8f962a654798-home" (OuterVolumeSpecName: "home") pod "487f6e70-3644-4837-b534-8f962a654798" (UID: "487f6e70-3644-4837-b534-8f962a654798"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:48:00.524201 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:48:00.524141 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/487f6e70-3644-4837-b534-8f962a654798-dshm\") pod \"487f6e70-3644-4837-b534-8f962a654798\" (UID: \"487f6e70-3644-4837-b534-8f962a654798\") " Apr 17 17:48:00.524201 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:48:00.524181 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c4290424-af92-4f3f-b633-7ff5db8c3d58-home\") pod \"c4290424-af92-4f3f-b633-7ff5db8c3d58\" (UID: \"c4290424-af92-4f3f-b633-7ff5db8c3d58\") " Apr 17 17:48:00.524312 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:48:00.524232 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c4290424-af92-4f3f-b633-7ff5db8c3d58-dshm\") pod \"c4290424-af92-4f3f-b633-7ff5db8c3d58\" (UID: \"c4290424-af92-4f3f-b633-7ff5db8c3d58\") " Apr 17 17:48:00.524312 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:48:00.524261 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/487f6e70-3644-4837-b534-8f962a654798-tls-certs\") pod \"487f6e70-3644-4837-b534-8f962a654798\" (UID: \"487f6e70-3644-4837-b534-8f962a654798\") " Apr 17 17:48:00.524425 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:48:00.524337 2573 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5h2cx\" (UniqueName: \"kubernetes.io/projected/c4290424-af92-4f3f-b633-7ff5db8c3d58-kube-api-access-5h2cx\") pod \"c4290424-af92-4f3f-b633-7ff5db8c3d58\" (UID: \"c4290424-af92-4f3f-b633-7ff5db8c3d58\") "
Apr 17 17:48:00.524425 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:48:00.524366 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c4290424-af92-4f3f-b633-7ff5db8c3d58-kserve-provision-location\") pod \"c4290424-af92-4f3f-b633-7ff5db8c3d58\" (UID: \"c4290424-af92-4f3f-b633-7ff5db8c3d58\") "
Apr 17 17:48:00.524425 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:48:00.524410 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/487f6e70-3644-4837-b534-8f962a654798-model-cache\") pod \"487f6e70-3644-4837-b534-8f962a654798\" (UID: \"487f6e70-3644-4837-b534-8f962a654798\") "
Apr 17 17:48:00.524595 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:48:00.524437 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/487f6e70-3644-4837-b534-8f962a654798-kserve-provision-location\") pod \"487f6e70-3644-4837-b534-8f962a654798\" (UID: \"487f6e70-3644-4837-b534-8f962a654798\") "
Apr 17 17:48:00.524743 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:48:00.524706 2573 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c4290424-af92-4f3f-b633-7ff5db8c3d58-model-cache\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\""
Apr 17 17:48:00.524743 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:48:00.524731 2573 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/487f6e70-3644-4837-b534-8f962a654798-home\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\""
Apr 17 17:48:00.526343 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:48:00.525220 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c4290424-af92-4f3f-b633-7ff5db8c3d58-home" (OuterVolumeSpecName: "home") pod "c4290424-af92-4f3f-b633-7ff5db8c3d58" (UID: "c4290424-af92-4f3f-b633-7ff5db8c3d58"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 17:48:00.526343 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:48:00.525716 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/487f6e70-3644-4837-b534-8f962a654798-model-cache" (OuterVolumeSpecName: "model-cache") pod "487f6e70-3644-4837-b534-8f962a654798" (UID: "487f6e70-3644-4837-b534-8f962a654798"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 17:48:00.526970 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:48:00.526888 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/487f6e70-3644-4837-b534-8f962a654798-dshm" (OuterVolumeSpecName: "dshm") pod "487f6e70-3644-4837-b534-8f962a654798" (UID: "487f6e70-3644-4837-b534-8f962a654798"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 17:48:00.526970 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:48:00.526949 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c4290424-af92-4f3f-b633-7ff5db8c3d58-dshm" (OuterVolumeSpecName: "dshm") pod "c4290424-af92-4f3f-b633-7ff5db8c3d58" (UID: "c4290424-af92-4f3f-b633-7ff5db8c3d58"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 17:48:00.529193 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:48:00.529158 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4290424-af92-4f3f-b633-7ff5db8c3d58-kube-api-access-5h2cx" (OuterVolumeSpecName: "kube-api-access-5h2cx") pod "c4290424-af92-4f3f-b633-7ff5db8c3d58" (UID: "c4290424-af92-4f3f-b633-7ff5db8c3d58"). InnerVolumeSpecName "kube-api-access-5h2cx". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 17:48:00.529638 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:48:00.529589 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/487f6e70-3644-4837-b534-8f962a654798-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "487f6e70-3644-4837-b534-8f962a654798" (UID: "487f6e70-3644-4837-b534-8f962a654798"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 17:48:00.529840 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:48:00.529801 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/487f6e70-3644-4837-b534-8f962a654798-kube-api-access-zg2tt" (OuterVolumeSpecName: "kube-api-access-zg2tt") pod "487f6e70-3644-4837-b534-8f962a654798" (UID: "487f6e70-3644-4837-b534-8f962a654798"). InnerVolumeSpecName "kube-api-access-zg2tt". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 17:48:00.531386 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:48:00.531348 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4290424-af92-4f3f-b633-7ff5db8c3d58-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "c4290424-af92-4f3f-b633-7ff5db8c3d58" (UID: "c4290424-af92-4f3f-b633-7ff5db8c3d58"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 17:48:00.584757 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:48:00.584693 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c4290424-af92-4f3f-b633-7ff5db8c3d58-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c4290424-af92-4f3f-b633-7ff5db8c3d58" (UID: "c4290424-af92-4f3f-b633-7ff5db8c3d58"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 17:48:00.619406 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:48:00.619357 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/487f6e70-3644-4837-b534-8f962a654798-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "487f6e70-3644-4837-b534-8f962a654798" (UID: "487f6e70-3644-4837-b534-8f962a654798"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 17:48:00.625896 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:48:00.625860 2573 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/487f6e70-3644-4837-b534-8f962a654798-model-cache\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\""
Apr 17 17:48:00.625896 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:48:00.625899 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/487f6e70-3644-4837-b534-8f962a654798-kserve-provision-location\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\""
Apr 17 17:48:00.626105 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:48:00.625915 2573 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c4290424-af92-4f3f-b633-7ff5db8c3d58-tls-certs\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\""
Apr 17 17:48:00.626105 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:48:00.625933 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zg2tt\" (UniqueName: \"kubernetes.io/projected/487f6e70-3644-4837-b534-8f962a654798-kube-api-access-zg2tt\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\""
Apr 17 17:48:00.626105 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:48:00.625946 2573 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/487f6e70-3644-4837-b534-8f962a654798-dshm\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\""
Apr 17 17:48:00.626105 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:48:00.625957 2573 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c4290424-af92-4f3f-b633-7ff5db8c3d58-home\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\""
Apr 17 17:48:00.626105 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:48:00.625968 2573 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c4290424-af92-4f3f-b633-7ff5db8c3d58-dshm\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\""
Apr 17 17:48:00.626105 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:48:00.625980 2573 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/487f6e70-3644-4837-b534-8f962a654798-tls-certs\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\""
Apr 17 17:48:00.626105 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:48:00.625991 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5h2cx\" (UniqueName: \"kubernetes.io/projected/c4290424-af92-4f3f-b633-7ff5db8c3d58-kube-api-access-5h2cx\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\""
Apr 17 17:48:00.626105 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:48:00.626004 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c4290424-af92-4f3f-b633-7ff5db8c3d58-kserve-provision-location\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\""
Apr 17 17:48:00.848380 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:48:00.848347 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7684bd49854bvf2_c4290424-af92-4f3f-b633-7ff5db8c3d58/main/0.log"
Apr 17 17:48:00.849133 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:48:00.849107 2573 generic.go:358] "Generic (PLEG): container finished" podID="c4290424-af92-4f3f-b633-7ff5db8c3d58" containerID="dc2e4e23860706df8fe1b02a267913b7bda217fcca33a6f699d9785862a19eeb" exitCode=137
Apr 17 17:48:00.849133 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:48:00.849133 2573 generic.go:358] "Generic (PLEG): container finished" podID="c4290424-af92-4f3f-b633-7ff5db8c3d58" containerID="f16fa6a8d7815ba976da21eb815d3c6ad509fd68c0c0a4806370344dfaadec39" exitCode=0
Apr 17 17:48:00.849322 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:48:00.849202 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7684bd49854bvf2"
Apr 17 17:48:00.849322 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:48:00.849198 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7684bd49854bvf2" event={"ID":"c4290424-af92-4f3f-b633-7ff5db8c3d58","Type":"ContainerDied","Data":"dc2e4e23860706df8fe1b02a267913b7bda217fcca33a6f699d9785862a19eeb"}
Apr 17 17:48:00.849322 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:48:00.849246 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7684bd49854bvf2" event={"ID":"c4290424-af92-4f3f-b633-7ff5db8c3d58","Type":"ContainerDied","Data":"f16fa6a8d7815ba976da21eb815d3c6ad509fd68c0c0a4806370344dfaadec39"}
Apr 17 17:48:00.849322 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:48:00.849264 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7684bd49854bvf2" event={"ID":"c4290424-af92-4f3f-b633-7ff5db8c3d58","Type":"ContainerDied","Data":"de23127e38e87e0e5fa2e598b1b460aafcb12ec5abf88c3758a61720dd32d5c6"}
Apr 17 17:48:00.849322 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:48:00.849285 2573 scope.go:117] "RemoveContainer" containerID="dc2e4e23860706df8fe1b02a267913b7bda217fcca33a6f699d9785862a19eeb"
Apr 17 17:48:00.850903 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:48:00.850878 2573 generic.go:358] "Generic (PLEG): container finished" podID="487f6e70-3644-4837-b534-8f962a654798" containerID="563d85d0f2d9f29e831264f4de1457776a1616427eef4e52dedd48aad33049fe" exitCode=137
Apr 17 17:48:00.851047 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:48:00.850923 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54r46z2" event={"ID":"487f6e70-3644-4837-b534-8f962a654798","Type":"ContainerDied","Data":"563d85d0f2d9f29e831264f4de1457776a1616427eef4e52dedd48aad33049fe"}
Apr 17 17:48:00.851047 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:48:00.850945 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54r46z2" event={"ID":"487f6e70-3644-4837-b534-8f962a654798","Type":"ContainerDied","Data":"7c21f333ad65318a34f1423702dd67133939eb4d655d3eaedcf193910655aeda"}
Apr 17 17:48:00.851182 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:48:00.851048 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54r46z2"
Apr 17 17:48:00.873574 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:48:00.873550 2573 scope.go:117] "RemoveContainer" containerID="a60bc395ccdf74c46d68aabc3b20b853491eb4fcffdc09a331fdef6581315c4a"
Apr 17 17:48:00.883953 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:48:00.883926 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7684bd49854bvf2"]
Apr 17 17:48:00.890445 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:48:00.890419 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7684bd49854bvf2"]
Apr 17 17:48:00.902837 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:48:00.902810 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54r46z2"]
Apr 17 17:48:00.908430 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:48:00.908406 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54r46z2"]
Apr 17 17:48:00.941225 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:48:00.941202 2573 scope.go:117] "RemoveContainer" containerID="f16fa6a8d7815ba976da21eb815d3c6ad509fd68c0c0a4806370344dfaadec39"
Apr 17 17:48:00.951387 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:48:00.951361 2573 scope.go:117] "RemoveContainer" containerID="dc2e4e23860706df8fe1b02a267913b7bda217fcca33a6f699d9785862a19eeb"
Apr 17 17:48:00.951769 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:48:00.951718 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc2e4e23860706df8fe1b02a267913b7bda217fcca33a6f699d9785862a19eeb\": container with ID starting with dc2e4e23860706df8fe1b02a267913b7bda217fcca33a6f699d9785862a19eeb not found: ID does not exist" containerID="dc2e4e23860706df8fe1b02a267913b7bda217fcca33a6f699d9785862a19eeb"
Apr 17 17:48:00.951882 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:48:00.951784 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc2e4e23860706df8fe1b02a267913b7bda217fcca33a6f699d9785862a19eeb"} err="failed to get container status \"dc2e4e23860706df8fe1b02a267913b7bda217fcca33a6f699d9785862a19eeb\": rpc error: code = NotFound desc = could not find container \"dc2e4e23860706df8fe1b02a267913b7bda217fcca33a6f699d9785862a19eeb\": container with ID starting with dc2e4e23860706df8fe1b02a267913b7bda217fcca33a6f699d9785862a19eeb not found: ID does not exist"
Apr 17 17:48:00.951882 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:48:00.951811 2573 scope.go:117] "RemoveContainer" containerID="a60bc395ccdf74c46d68aabc3b20b853491eb4fcffdc09a331fdef6581315c4a"
Apr 17 17:48:00.952204 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:48:00.952167 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a60bc395ccdf74c46d68aabc3b20b853491eb4fcffdc09a331fdef6581315c4a\": container with ID starting with a60bc395ccdf74c46d68aabc3b20b853491eb4fcffdc09a331fdef6581315c4a not found: ID does not exist" containerID="a60bc395ccdf74c46d68aabc3b20b853491eb4fcffdc09a331fdef6581315c4a"
Apr 17 17:48:00.952303 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:48:00.952204 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a60bc395ccdf74c46d68aabc3b20b853491eb4fcffdc09a331fdef6581315c4a"} err="failed to get container status \"a60bc395ccdf74c46d68aabc3b20b853491eb4fcffdc09a331fdef6581315c4a\": rpc error: code = NotFound desc = could not find container \"a60bc395ccdf74c46d68aabc3b20b853491eb4fcffdc09a331fdef6581315c4a\": container with ID starting with a60bc395ccdf74c46d68aabc3b20b853491eb4fcffdc09a331fdef6581315c4a not found: ID does not exist"
Apr 17 17:48:00.952303 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:48:00.952231 2573 scope.go:117] "RemoveContainer" containerID="f16fa6a8d7815ba976da21eb815d3c6ad509fd68c0c0a4806370344dfaadec39"
Apr 17 17:48:00.952536 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:48:00.952497 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f16fa6a8d7815ba976da21eb815d3c6ad509fd68c0c0a4806370344dfaadec39\": container with ID starting with f16fa6a8d7815ba976da21eb815d3c6ad509fd68c0c0a4806370344dfaadec39 not found: ID does not exist" containerID="f16fa6a8d7815ba976da21eb815d3c6ad509fd68c0c0a4806370344dfaadec39"
Apr 17 17:48:00.952536 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:48:00.952534 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f16fa6a8d7815ba976da21eb815d3c6ad509fd68c0c0a4806370344dfaadec39"} err="failed to get container status \"f16fa6a8d7815ba976da21eb815d3c6ad509fd68c0c0a4806370344dfaadec39\": rpc error: code = NotFound desc = could not find container \"f16fa6a8d7815ba976da21eb815d3c6ad509fd68c0c0a4806370344dfaadec39\": container with ID starting with f16fa6a8d7815ba976da21eb815d3c6ad509fd68c0c0a4806370344dfaadec39 not found: ID does not exist"
Apr 17 17:48:00.952714 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:48:00.952554 2573 scope.go:117] "RemoveContainer" containerID="dc2e4e23860706df8fe1b02a267913b7bda217fcca33a6f699d9785862a19eeb"
Apr 17 17:48:00.952945 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:48:00.952910 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc2e4e23860706df8fe1b02a267913b7bda217fcca33a6f699d9785862a19eeb"} err="failed to get container status \"dc2e4e23860706df8fe1b02a267913b7bda217fcca33a6f699d9785862a19eeb\": rpc error: code = NotFound desc = could not find container \"dc2e4e23860706df8fe1b02a267913b7bda217fcca33a6f699d9785862a19eeb\": container with ID starting with dc2e4e23860706df8fe1b02a267913b7bda217fcca33a6f699d9785862a19eeb not found: ID does not exist"
Apr 17 17:48:00.952945 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:48:00.952938 2573 scope.go:117] "RemoveContainer" containerID="a60bc395ccdf74c46d68aabc3b20b853491eb4fcffdc09a331fdef6581315c4a"
Apr 17 17:48:00.953234 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:48:00.953206 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a60bc395ccdf74c46d68aabc3b20b853491eb4fcffdc09a331fdef6581315c4a"} err="failed to get container status \"a60bc395ccdf74c46d68aabc3b20b853491eb4fcffdc09a331fdef6581315c4a\": rpc error: code = NotFound desc = could not find container \"a60bc395ccdf74c46d68aabc3b20b853491eb4fcffdc09a331fdef6581315c4a\": container with ID starting with a60bc395ccdf74c46d68aabc3b20b853491eb4fcffdc09a331fdef6581315c4a not found: ID does not exist"
Apr 17 17:48:00.953309 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:48:00.953237 2573 scope.go:117] "RemoveContainer" containerID="f16fa6a8d7815ba976da21eb815d3c6ad509fd68c0c0a4806370344dfaadec39"
Apr 17 17:48:00.953521 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:48:00.953498 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f16fa6a8d7815ba976da21eb815d3c6ad509fd68c0c0a4806370344dfaadec39"} err="failed to get container status \"f16fa6a8d7815ba976da21eb815d3c6ad509fd68c0c0a4806370344dfaadec39\": rpc error: code = NotFound desc = could not find container \"f16fa6a8d7815ba976da21eb815d3c6ad509fd68c0c0a4806370344dfaadec39\": container with ID starting with f16fa6a8d7815ba976da21eb815d3c6ad509fd68c0c0a4806370344dfaadec39 not found: ID does not exist"
Apr 17 17:48:00.953598 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:48:00.953525 2573 scope.go:117] "RemoveContainer" containerID="563d85d0f2d9f29e831264f4de1457776a1616427eef4e52dedd48aad33049fe"
Apr 17 17:48:00.979666 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:48:00.979642 2573 scope.go:117] "RemoveContainer" containerID="897786768afb35c9d97cd2279f0b63eee4d5624ef6fcdd021e6b52b95649b9ec"
Apr 17 17:48:01.049815 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:48:01.049604 2573 scope.go:117] "RemoveContainer" containerID="563d85d0f2d9f29e831264f4de1457776a1616427eef4e52dedd48aad33049fe"
Apr 17 17:48:01.049999 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:48:01.049969 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"563d85d0f2d9f29e831264f4de1457776a1616427eef4e52dedd48aad33049fe\": container with ID starting with 563d85d0f2d9f29e831264f4de1457776a1616427eef4e52dedd48aad33049fe not found: ID does not exist" containerID="563d85d0f2d9f29e831264f4de1457776a1616427eef4e52dedd48aad33049fe"
Apr 17 17:48:01.050126 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:48:01.050030 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"563d85d0f2d9f29e831264f4de1457776a1616427eef4e52dedd48aad33049fe"} err="failed to get container status \"563d85d0f2d9f29e831264f4de1457776a1616427eef4e52dedd48aad33049fe\": rpc error: code = NotFound desc = could not find container \"563d85d0f2d9f29e831264f4de1457776a1616427eef4e52dedd48aad33049fe\": container with ID starting with 563d85d0f2d9f29e831264f4de1457776a1616427eef4e52dedd48aad33049fe not found: ID does not exist"
Apr 17 17:48:01.050126 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:48:01.050062 2573 scope.go:117] "RemoveContainer" containerID="897786768afb35c9d97cd2279f0b63eee4d5624ef6fcdd021e6b52b95649b9ec"
Apr 17 17:48:01.050394 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:48:01.050373 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"897786768afb35c9d97cd2279f0b63eee4d5624ef6fcdd021e6b52b95649b9ec\": container with ID starting with 897786768afb35c9d97cd2279f0b63eee4d5624ef6fcdd021e6b52b95649b9ec not found: ID does not exist" containerID="897786768afb35c9d97cd2279f0b63eee4d5624ef6fcdd021e6b52b95649b9ec"
Apr 17 17:48:01.050485 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:48:01.050401 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"897786768afb35c9d97cd2279f0b63eee4d5624ef6fcdd021e6b52b95649b9ec"} err="failed to get container status \"897786768afb35c9d97cd2279f0b63eee4d5624ef6fcdd021e6b52b95649b9ec\": rpc error: code = NotFound desc = could not find container \"897786768afb35c9d97cd2279f0b63eee4d5624ef6fcdd021e6b52b95649b9ec\": container with ID starting with 897786768afb35c9d97cd2279f0b63eee4d5624ef6fcdd021e6b52b95649b9ec not found: ID does not exist"
Apr 17 17:48:01.197077 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:48:01.196964 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="487f6e70-3644-4837-b534-8f962a654798" path="/var/lib/kubelet/pods/487f6e70-3644-4837-b534-8f962a654798/volumes"
Apr 17 17:48:01.197642 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:48:01.197569 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4290424-af92-4f3f-b633-7ff5db8c3d58" path="/var/lib/kubelet/pods/c4290424-af92-4f3f-b633-7ff5db8c3d58/volumes"
Apr 17 17:48:04.752990 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:48:04.752892 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b5496cb4d-lv2cp" podUID="2c6c05e1-9ed7-47cf-b3d9-fb52c0dcbaf8" containerName="main" probeResult="failure" output="Get \"https://10.133.0.33:8001/health\": dial tcp 10.133.0.33:8001: connect: connection refused"
Apr 17 17:48:04.765890 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:48:04.765852 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-75777d65d9-k7tj7" podUID="030ac2a5-cc7d-49c9-ae10-69cc1c17fd8b" containerName="main" probeResult="failure" output="Get \"https://10.133.0.34:8000/health\": dial tcp 10.133.0.34:8000: connect: connection refused"
Apr 17 17:48:09.643298 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:48:09.643246 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="c019f0cd-4f9d-4493-9b5a-1ce1b0d9ffaf" containerName="main" probeResult="failure" output="Get \"https://10.133.0.32:8000/health\": dial tcp 10.133.0.32:8000: connect: connection refused"
Apr 17 17:48:14.752509 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:48:14.752460 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b5496cb4d-lv2cp" podUID="2c6c05e1-9ed7-47cf-b3d9-fb52c0dcbaf8" containerName="main" probeResult="failure" output="Get \"https://10.133.0.33:8001/health\": dial tcp 10.133.0.33:8001: connect: connection refused"
Apr 17 17:48:14.766227 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:48:14.766183 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-75777d65d9-k7tj7" podUID="030ac2a5-cc7d-49c9-ae10-69cc1c17fd8b" containerName="main" probeResult="failure" output="Get \"https://10.133.0.34:8000/health\": dial tcp 10.133.0.34:8000: connect: connection refused"
Apr 17 17:48:19.643356 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:48:19.643314 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="c019f0cd-4f9d-4493-9b5a-1ce1b0d9ffaf" containerName="main" probeResult="failure" output="Get \"https://10.133.0.32:8000/health\": dial tcp 10.133.0.32:8000: connect: connection refused"
Apr 17 17:48:24.753116 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:48:24.753066 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b5496cb4d-lv2cp" podUID="2c6c05e1-9ed7-47cf-b3d9-fb52c0dcbaf8" containerName="main" probeResult="failure" output="Get \"https://10.133.0.33:8001/health\": dial tcp 10.133.0.33:8001: connect: connection refused"
Apr 17 17:48:24.765954 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:48:24.765923 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-75777d65d9-k7tj7" podUID="030ac2a5-cc7d-49c9-ae10-69cc1c17fd8b" containerName="main" probeResult="failure" output="Get \"https://10.133.0.34:8000/health\": dial tcp 10.133.0.34:8000: connect: connection refused"
Apr 17 17:48:29.642970 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:48:29.642928 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="c019f0cd-4f9d-4493-9b5a-1ce1b0d9ffaf" containerName="main" probeResult="failure" output="Get \"https://10.133.0.32:8000/health\": dial tcp 10.133.0.32:8000: connect: connection refused"
Apr 17 17:48:34.752792 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:48:34.752739 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b5496cb4d-lv2cp" podUID="2c6c05e1-9ed7-47cf-b3d9-fb52c0dcbaf8" containerName="main" probeResult="failure" output="Get \"https://10.133.0.33:8001/health\": dial tcp 10.133.0.33:8001: connect: connection refused"
Apr 17 17:48:34.765870 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:48:34.765842 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-75777d65d9-k7tj7" podUID="030ac2a5-cc7d-49c9-ae10-69cc1c17fd8b" containerName="main" probeResult="failure" output="Get \"https://10.133.0.34:8000/health\": dial tcp 10.133.0.34:8000: connect: connection refused"
Apr 17 17:48:39.642927 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:48:39.642888 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="c019f0cd-4f9d-4493-9b5a-1ce1b0d9ffaf" containerName="main" probeResult="failure" output="Get \"https://10.133.0.32:8000/health\": dial tcp 10.133.0.32:8000: connect: connection refused"
Apr 17 17:48:44.752871 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:48:44.752827 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b5496cb4d-lv2cp" podUID="2c6c05e1-9ed7-47cf-b3d9-fb52c0dcbaf8" containerName="main" probeResult="failure" output="Get \"https://10.133.0.33:8001/health\": dial tcp 10.133.0.33:8001: connect: connection refused"
Apr 17 17:48:44.766326 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:48:44.766296 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-75777d65d9-k7tj7" podUID="030ac2a5-cc7d-49c9-ae10-69cc1c17fd8b" containerName="main" probeResult="failure" output="Get \"https://10.133.0.34:8000/health\": dial tcp 10.133.0.34:8000: connect: connection refused"
Apr 17 17:48:49.643378 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:48:49.643337 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="c019f0cd-4f9d-4493-9b5a-1ce1b0d9ffaf" containerName="main" probeResult="failure" output="Get \"https://10.133.0.32:8000/health\": dial tcp 10.133.0.32:8000: connect: connection refused"
Apr 17 17:48:54.752105 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:48:54.752061 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b5496cb4d-lv2cp" podUID="2c6c05e1-9ed7-47cf-b3d9-fb52c0dcbaf8" containerName="main" probeResult="failure" output="Get \"https://10.133.0.33:8001/health\": dial tcp 10.133.0.33:8001: connect: connection refused"
Apr 17 17:48:54.766605 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:48:54.766565 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-75777d65d9-k7tj7" podUID="030ac2a5-cc7d-49c9-ae10-69cc1c17fd8b" containerName="main" probeResult="failure" output="Get \"https://10.133.0.34:8000/health\": dial tcp 10.133.0.34:8000: connect: connection refused"
Apr 17 17:48:59.642779 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:48:59.642740 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="c019f0cd-4f9d-4493-9b5a-1ce1b0d9ffaf" containerName="main" probeResult="failure" output="Get \"https://10.133.0.32:8000/health\": dial tcp 10.133.0.32:8000: connect: connection refused"
Apr 17 17:49:04.752703 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:49:04.752648 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b5496cb4d-lv2cp" podUID="2c6c05e1-9ed7-47cf-b3d9-fb52c0dcbaf8" containerName="main" probeResult="failure" output="Get \"https://10.133.0.33:8001/health\": dial tcp 10.133.0.33:8001: connect: connection refused"
Apr 17 17:49:04.765835 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:49:04.765800 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-75777d65d9-k7tj7" podUID="030ac2a5-cc7d-49c9-ae10-69cc1c17fd8b" containerName="main" probeResult="failure" output="Get \"https://10.133.0.34:8000/health\": dial tcp 10.133.0.34:8000: connect: connection refused"
Apr 17 17:49:09.642708 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:49:09.642658 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="c019f0cd-4f9d-4493-9b5a-1ce1b0d9ffaf" containerName="main" probeResult="failure" output="Get \"https://10.133.0.32:8000/health\": dial tcp 10.133.0.32:8000: connect: connection refused"
Apr 17 17:49:14.752159 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:49:14.752109 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b5496cb4d-lv2cp" podUID="2c6c05e1-9ed7-47cf-b3d9-fb52c0dcbaf8" containerName="main" probeResult="failure" output="Get \"https://10.133.0.33:8001/health\": dial tcp 10.133.0.33:8001: connect: connection refused"
Apr 17 17:49:14.766043 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:49:14.765994 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-75777d65d9-k7tj7" podUID="030ac2a5-cc7d-49c9-ae10-69cc1c17fd8b" containerName="main" probeResult="failure" output="Get \"https://10.133.0.34:8000/health\": dial tcp 10.133.0.34:8000: connect: connection refused"
Apr 17 17:49:19.643251 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:49:19.643209 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="c019f0cd-4f9d-4493-9b5a-1ce1b0d9ffaf" containerName="main" probeResult="failure" output="Get \"https://10.133.0.32:8000/health\": dial tcp 10.133.0.32:8000: connect: connection refused"
Apr 17 17:49:24.753109 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:49:24.753064 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b5496cb4d-lv2cp" podUID="2c6c05e1-9ed7-47cf-b3d9-fb52c0dcbaf8" containerName="main" probeResult="failure" output="Get \"https://10.133.0.33:8001/health\": dial tcp 10.133.0.33:8001: connect: connection refused"
Apr 17 17:49:24.766268 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:49:24.766228 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-75777d65d9-k7tj7" podUID="030ac2a5-cc7d-49c9-ae10-69cc1c17fd8b" containerName="main" probeResult="failure" output="Get \"https://10.133.0.34:8000/health\": dial tcp 10.133.0.34:8000: connect: connection refused"
Apr 17 17:49:29.643010 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:49:29.642966 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="c019f0cd-4f9d-4493-9b5a-1ce1b0d9ffaf" containerName="main" probeResult="failure" output="Get \"https://10.133.0.32:8000/health\": dial tcp 10.133.0.32:8000: connect: connection refused"
Apr 17 17:49:34.753192 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:49:34.753086 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b5496cb4d-lv2cp" podUID="2c6c05e1-9ed7-47cf-b3d9-fb52c0dcbaf8" containerName="main" probeResult="failure" output="Get \"https://10.133.0.33:8001/health\": dial tcp 10.133.0.33:8001: connect: connection refused"
Apr 17 17:49:34.766442 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:49:34.766411 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-75777d65d9-k7tj7" podUID="030ac2a5-cc7d-49c9-ae10-69cc1c17fd8b" containerName="main" probeResult="failure" output="Get \"https://10.133.0.34:8000/health\": dial tcp 10.133.0.34:8000: connect: connection refused"
Apr 17 17:49:39.642625 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:49:39.642587 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="c019f0cd-4f9d-4493-9b5a-1ce1b0d9ffaf" containerName="main" probeResult="failure" output="Get \"https://10.133.0.32:8000/health\": dial tcp 10.133.0.32:8000: connect: connection refused"
Apr 17 17:49:44.752250 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:49:44.752196 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b5496cb4d-lv2cp" podUID="2c6c05e1-9ed7-47cf-b3d9-fb52c0dcbaf8" containerName="main" probeResult="failure" output="Get \"https://10.133.0.33:8001/health\": dial tcp 10.133.0.33:8001: connect: connection refused"
Apr 17 17:49:44.766190 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:49:44.766158 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-75777d65d9-k7tj7" podUID="030ac2a5-cc7d-49c9-ae10-69cc1c17fd8b" containerName="main" probeResult="failure" output="Get \"https://10.133.0.34:8000/health\": dial tcp 10.133.0.34:8000: connect: connection refused"
Apr 17 17:49:49.652399 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:49:49.652359 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 17 17:49:49.659899 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:49:49.659869 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 17 17:49:54.752544 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:49:54.752491 2573 prober.go:120] "Probe failed" probeType="Startup"
pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b5496cb4d-lv2cp" podUID="2c6c05e1-9ed7-47cf-b3d9-fb52c0dcbaf8" containerName="main" probeResult="failure" output="Get \"https://10.133.0.33:8001/health\": dial tcp 10.133.0.33:8001: connect: connection refused" Apr 17 17:49:54.765990 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:49:54.765948 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-75777d65d9-k7tj7" podUID="030ac2a5-cc7d-49c9-ae10-69cc1c17fd8b" containerName="main" probeResult="failure" output="Get \"https://10.133.0.34:8000/health\": dial tcp 10.133.0.34:8000: connect: connection refused" Apr 17 17:50:04.752097 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:50:04.752047 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b5496cb4d-lv2cp" podUID="2c6c05e1-9ed7-47cf-b3d9-fb52c0dcbaf8" containerName="main" probeResult="failure" output="Get \"https://10.133.0.33:8001/health\": dial tcp 10.133.0.33:8001: connect: connection refused" Apr 17 17:50:04.766241 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:50:04.766198 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-75777d65d9-k7tj7" podUID="030ac2a5-cc7d-49c9-ae10-69cc1c17fd8b" containerName="main" probeResult="failure" output="Get \"https://10.133.0.34:8000/health\": dial tcp 10.133.0.34:8000: connect: connection refused" Apr 17 17:50:04.974215 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:50:04.974184 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"] Apr 17 17:50:04.974502 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:50:04.974477 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" 
podUID="c019f0cd-4f9d-4493-9b5a-1ce1b0d9ffaf" containerName="main" containerID="cri-o://28df246bc4b95776fd1ab37f0f5fd943d6cf0d15f50c3e10aa136368dcba0d44" gracePeriod=30 Apr 17 17:50:06.270705 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:50:06.270643 2573 generic.go:358] "Generic (PLEG): container finished" podID="c019f0cd-4f9d-4493-9b5a-1ce1b0d9ffaf" containerID="28df246bc4b95776fd1ab37f0f5fd943d6cf0d15f50c3e10aa136368dcba0d44" exitCode=0 Apr 17 17:50:06.270705 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:50:06.270680 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"c019f0cd-4f9d-4493-9b5a-1ce1b0d9ffaf","Type":"ContainerDied","Data":"28df246bc4b95776fd1ab37f0f5fd943d6cf0d15f50c3e10aa136368dcba0d44"} Apr 17 17:50:06.333811 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:50:06.333790 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 17 17:50:06.398587 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:50:06.398562 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c019f0cd-4f9d-4493-9b5a-1ce1b0d9ffaf-tls-certs\") pod \"c019f0cd-4f9d-4493-9b5a-1ce1b0d9ffaf\" (UID: \"c019f0cd-4f9d-4493-9b5a-1ce1b0d9ffaf\") " Apr 17 17:50:06.398730 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:50:06.398598 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-88gvt\" (UniqueName: \"kubernetes.io/projected/c019f0cd-4f9d-4493-9b5a-1ce1b0d9ffaf-kube-api-access-88gvt\") pod \"c019f0cd-4f9d-4493-9b5a-1ce1b0d9ffaf\" (UID: \"c019f0cd-4f9d-4493-9b5a-1ce1b0d9ffaf\") " Apr 17 17:50:06.398730 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:50:06.398623 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/c019f0cd-4f9d-4493-9b5a-1ce1b0d9ffaf-model-cache\") pod \"c019f0cd-4f9d-4493-9b5a-1ce1b0d9ffaf\" (UID: \"c019f0cd-4f9d-4493-9b5a-1ce1b0d9ffaf\") " Apr 17 17:50:06.398730 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:50:06.398648 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c019f0cd-4f9d-4493-9b5a-1ce1b0d9ffaf-kserve-provision-location\") pod \"c019f0cd-4f9d-4493-9b5a-1ce1b0d9ffaf\" (UID: \"c019f0cd-4f9d-4493-9b5a-1ce1b0d9ffaf\") " Apr 17 17:50:06.398730 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:50:06.398694 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c019f0cd-4f9d-4493-9b5a-1ce1b0d9ffaf-dshm\") pod \"c019f0cd-4f9d-4493-9b5a-1ce1b0d9ffaf\" (UID: \"c019f0cd-4f9d-4493-9b5a-1ce1b0d9ffaf\") " Apr 17 17:50:06.398952 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:50:06.398752 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c019f0cd-4f9d-4493-9b5a-1ce1b0d9ffaf-home\") pod \"c019f0cd-4f9d-4493-9b5a-1ce1b0d9ffaf\" (UID: \"c019f0cd-4f9d-4493-9b5a-1ce1b0d9ffaf\") " Apr 17 17:50:06.399287 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:50:06.399230 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c019f0cd-4f9d-4493-9b5a-1ce1b0d9ffaf-home" (OuterVolumeSpecName: "home") pod "c019f0cd-4f9d-4493-9b5a-1ce1b0d9ffaf" (UID: "c019f0cd-4f9d-4493-9b5a-1ce1b0d9ffaf"). InnerVolumeSpecName "home". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:50:06.399409 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:50:06.399314 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c019f0cd-4f9d-4493-9b5a-1ce1b0d9ffaf-model-cache" (OuterVolumeSpecName: "model-cache") pod "c019f0cd-4f9d-4493-9b5a-1ce1b0d9ffaf" (UID: "c019f0cd-4f9d-4493-9b5a-1ce1b0d9ffaf"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:50:06.401226 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:50:06.401197 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c019f0cd-4f9d-4493-9b5a-1ce1b0d9ffaf-dshm" (OuterVolumeSpecName: "dshm") pod "c019f0cd-4f9d-4493-9b5a-1ce1b0d9ffaf" (UID: "c019f0cd-4f9d-4493-9b5a-1ce1b0d9ffaf"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:50:06.401226 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:50:06.401203 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c019f0cd-4f9d-4493-9b5a-1ce1b0d9ffaf-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "c019f0cd-4f9d-4493-9b5a-1ce1b0d9ffaf" (UID: "c019f0cd-4f9d-4493-9b5a-1ce1b0d9ffaf"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:50:06.401408 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:50:06.401207 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c019f0cd-4f9d-4493-9b5a-1ce1b0d9ffaf-kube-api-access-88gvt" (OuterVolumeSpecName: "kube-api-access-88gvt") pod "c019f0cd-4f9d-4493-9b5a-1ce1b0d9ffaf" (UID: "c019f0cd-4f9d-4493-9b5a-1ce1b0d9ffaf"). InnerVolumeSpecName "kube-api-access-88gvt". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:50:06.449798 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:50:06.449717 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c019f0cd-4f9d-4493-9b5a-1ce1b0d9ffaf-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c019f0cd-4f9d-4493-9b5a-1ce1b0d9ffaf" (UID: "c019f0cd-4f9d-4493-9b5a-1ce1b0d9ffaf"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:50:06.499319 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:50:06.499291 2573 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c019f0cd-4f9d-4493-9b5a-1ce1b0d9ffaf-dshm\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\"" Apr 17 17:50:06.499319 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:50:06.499313 2573 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c019f0cd-4f9d-4493-9b5a-1ce1b0d9ffaf-home\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\"" Apr 17 17:50:06.499319 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:50:06.499323 2573 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c019f0cd-4f9d-4493-9b5a-1ce1b0d9ffaf-tls-certs\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\"" Apr 17 17:50:06.499568 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:50:06.499334 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-88gvt\" (UniqueName: \"kubernetes.io/projected/c019f0cd-4f9d-4493-9b5a-1ce1b0d9ffaf-kube-api-access-88gvt\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\"" Apr 17 17:50:06.499568 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:50:06.499344 2573 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c019f0cd-4f9d-4493-9b5a-1ce1b0d9ffaf-model-cache\") on 
node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\"" Apr 17 17:50:06.499568 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:50:06.499353 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c019f0cd-4f9d-4493-9b5a-1ce1b0d9ffaf-kserve-provision-location\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\"" Apr 17 17:50:07.278597 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:50:07.278554 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"c019f0cd-4f9d-4493-9b5a-1ce1b0d9ffaf","Type":"ContainerDied","Data":"e91e35233c02345427c5e22acfe58452fb5a9ac094c57023317a5a0d8a357ea3"} Apr 17 17:50:07.278597 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:50:07.278607 2573 scope.go:117] "RemoveContainer" containerID="28df246bc4b95776fd1ab37f0f5fd943d6cf0d15f50c3e10aa136368dcba0d44" Apr 17 17:50:07.279159 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:50:07.278567 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 17 17:50:07.297316 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:50:07.297284 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"] Apr 17 17:50:07.298145 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:50:07.298115 2573 scope.go:117] "RemoveContainer" containerID="2c4a7e917d3fd80d0358efa21175ee7f6b9cc49a4a63d09b677171a221907ce8" Apr 17 17:50:07.300848 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:50:07.300825 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"] Apr 17 17:50:09.196150 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:50:09.196119 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c019f0cd-4f9d-4493-9b5a-1ce1b0d9ffaf" path="/var/lib/kubelet/pods/c019f0cd-4f9d-4493-9b5a-1ce1b0d9ffaf/volumes" Apr 17 17:50:14.752221 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:50:14.752167 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b5496cb4d-lv2cp" podUID="2c6c05e1-9ed7-47cf-b3d9-fb52c0dcbaf8" containerName="main" probeResult="failure" output="Get \"https://10.133.0.33:8001/health\": dial tcp 10.133.0.33:8001: connect: connection refused" Apr 17 17:50:14.766693 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:50:14.766651 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-75777d65d9-k7tj7" podUID="030ac2a5-cc7d-49c9-ae10-69cc1c17fd8b" containerName="main" probeResult="failure" output="Get \"https://10.133.0.34:8000/health\": dial tcp 10.133.0.34:8000: connect: connection refused" Apr 17 17:50:24.762190 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:50:24.762158 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" 
pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b5496cb4d-lv2cp" Apr 17 17:50:24.775431 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:50:24.775408 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b5496cb4d-lv2cp" Apr 17 17:50:24.775744 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:50:24.775726 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-75777d65d9-k7tj7" Apr 17 17:50:24.783264 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:50:24.783214 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-75777d65d9-k7tj7" Apr 17 17:50:53.697773 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:50:53.697740 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-75777d65d9-k7tj7"] Apr 17 17:50:53.700493 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:50:53.698129 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-75777d65d9-k7tj7" podUID="030ac2a5-cc7d-49c9-ae10-69cc1c17fd8b" containerName="main" containerID="cri-o://0799c2815589d644f0c31c013280acd81c9e3853d1cd2abf3bd3ce16030ddf2b" gracePeriod=30 Apr 17 17:50:53.712873 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:50:53.712846 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b5496cb4d-lv2cp"] Apr 17 17:50:53.713374 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:50:53.713312 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b5496cb4d-lv2cp" podUID="2c6c05e1-9ed7-47cf-b3d9-fb52c0dcbaf8" containerName="main" 
containerID="cri-o://1ff3d399e35d3e87bf26c7d15e5059abe6f0ea7f4c300768ca11e27a6580323c" gracePeriod=30 Apr 17 17:50:53.814861 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:50:53.814831 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-q97s8"] Apr 17 17:50:53.815191 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:50:53.815166 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c4290424-af92-4f3f-b633-7ff5db8c3d58" containerName="main" Apr 17 17:50:53.815191 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:50:53.815182 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4290424-af92-4f3f-b633-7ff5db8c3d58" containerName="main" Apr 17 17:50:53.815191 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:50:53.815194 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c4290424-af92-4f3f-b633-7ff5db8c3d58" containerName="storage-initializer" Apr 17 17:50:53.815378 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:50:53.815200 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4290424-af92-4f3f-b633-7ff5db8c3d58" containerName="storage-initializer" Apr 17 17:50:53.815378 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:50:53.815207 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c019f0cd-4f9d-4493-9b5a-1ce1b0d9ffaf" containerName="main" Apr 17 17:50:53.815378 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:50:53.815213 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="c019f0cd-4f9d-4493-9b5a-1ce1b0d9ffaf" containerName="main" Apr 17 17:50:53.815378 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:50:53.815226 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="772f98d2-3f0e-4fc5-a59b-75a7337b9299" containerName="storage-initializer" Apr 17 17:50:53.815378 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:50:53.815231 2573 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="772f98d2-3f0e-4fc5-a59b-75a7337b9299" containerName="storage-initializer" Apr 17 17:50:53.815378 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:50:53.815237 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="772f98d2-3f0e-4fc5-a59b-75a7337b9299" containerName="main" Apr 17 17:50:53.815378 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:50:53.815243 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="772f98d2-3f0e-4fc5-a59b-75a7337b9299" containerName="main" Apr 17 17:50:53.815378 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:50:53.815248 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c019f0cd-4f9d-4493-9b5a-1ce1b0d9ffaf" containerName="storage-initializer" Apr 17 17:50:53.815378 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:50:53.815253 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="c019f0cd-4f9d-4493-9b5a-1ce1b0d9ffaf" containerName="storage-initializer" Apr 17 17:50:53.815378 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:50:53.815260 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c4290424-af92-4f3f-b633-7ff5db8c3d58" containerName="llm-d-routing-sidecar" Apr 17 17:50:53.815378 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:50:53.815265 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4290424-af92-4f3f-b633-7ff5db8c3d58" containerName="llm-d-routing-sidecar" Apr 17 17:50:53.815378 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:50:53.815271 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="487f6e70-3644-4837-b534-8f962a654798" containerName="storage-initializer" Apr 17 17:50:53.815378 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:50:53.815276 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="487f6e70-3644-4837-b534-8f962a654798" containerName="storage-initializer" Apr 17 17:50:53.815378 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:50:53.815282 2573 cpu_manager.go:401] 
"RemoveStaleState: containerMap: removing container" podUID="487f6e70-3644-4837-b534-8f962a654798" containerName="main" Apr 17 17:50:53.815378 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:50:53.815289 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="487f6e70-3644-4837-b534-8f962a654798" containerName="main" Apr 17 17:50:53.815378 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:50:53.815355 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="c4290424-af92-4f3f-b633-7ff5db8c3d58" containerName="llm-d-routing-sidecar" Apr 17 17:50:53.815378 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:50:53.815367 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="c4290424-af92-4f3f-b633-7ff5db8c3d58" containerName="main" Apr 17 17:50:53.815378 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:50:53.815376 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="772f98d2-3f0e-4fc5-a59b-75a7337b9299" containerName="main" Apr 17 17:50:53.815378 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:50:53.815387 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="c019f0cd-4f9d-4493-9b5a-1ce1b0d9ffaf" containerName="main" Apr 17 17:50:53.816255 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:50:53.815397 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="487f6e70-3644-4837-b534-8f962a654798" containerName="main" Apr 17 17:50:53.818417 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:50:53.818398 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-q97s8" Apr 17 17:50:53.820750 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:50:53.820728 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"istio-ca-root-cert\"" Apr 17 17:50:53.820926 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:50:53.820910 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-gateway-2-openshift-default-dockercfg-jn5w9\"" Apr 17 17:50:53.833992 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:50:53.833970 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-q97s8"] Apr 17 17:50:53.896372 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:50:53.896345 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/1c54b1a9-e44d-4291-acc0-695eaeefb002-credential-socket\") pod \"router-gateway-2-openshift-default-6866b85949-q97s8\" (UID: \"1c54b1a9-e44d-4291-acc0-695eaeefb002\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-q97s8" Apr 17 17:50:53.896467 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:50:53.896376 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/1c54b1a9-e44d-4291-acc0-695eaeefb002-workload-socket\") pod \"router-gateway-2-openshift-default-6866b85949-q97s8\" (UID: \"1c54b1a9-e44d-4291-acc0-695eaeefb002\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-q97s8" Apr 17 17:50:53.896467 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:50:53.896403 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: 
\"kubernetes.io/empty-dir/1c54b1a9-e44d-4291-acc0-695eaeefb002-workload-certs\") pod \"router-gateway-2-openshift-default-6866b85949-q97s8\" (UID: \"1c54b1a9-e44d-4291-acc0-695eaeefb002\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-q97s8" Apr 17 17:50:53.896467 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:50:53.896449 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/1c54b1a9-e44d-4291-acc0-695eaeefb002-istiod-ca-cert\") pod \"router-gateway-2-openshift-default-6866b85949-q97s8\" (UID: \"1c54b1a9-e44d-4291-acc0-695eaeefb002\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-q97s8" Apr 17 17:50:53.896603 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:50:53.896469 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/1c54b1a9-e44d-4291-acc0-695eaeefb002-istio-podinfo\") pod \"router-gateway-2-openshift-default-6866b85949-q97s8\" (UID: \"1c54b1a9-e44d-4291-acc0-695eaeefb002\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-q97s8" Apr 17 17:50:53.896603 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:50:53.896562 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/1c54b1a9-e44d-4291-acc0-695eaeefb002-istio-envoy\") pod \"router-gateway-2-openshift-default-6866b85949-q97s8\" (UID: \"1c54b1a9-e44d-4291-acc0-695eaeefb002\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-q97s8" Apr 17 17:50:53.896699 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:50:53.896597 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/1c54b1a9-e44d-4291-acc0-695eaeefb002-istio-token\") pod 
\"router-gateway-2-openshift-default-6866b85949-q97s8\" (UID: \"1c54b1a9-e44d-4291-acc0-695eaeefb002\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-q97s8" Apr 17 17:50:53.896699 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:50:53.896636 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/1c54b1a9-e44d-4291-acc0-695eaeefb002-istio-data\") pod \"router-gateway-2-openshift-default-6866b85949-q97s8\" (UID: \"1c54b1a9-e44d-4291-acc0-695eaeefb002\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-q97s8" Apr 17 17:50:53.896809 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:50:53.896711 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fspn2\" (UniqueName: \"kubernetes.io/projected/1c54b1a9-e44d-4291-acc0-695eaeefb002-kube-api-access-fspn2\") pod \"router-gateway-2-openshift-default-6866b85949-q97s8\" (UID: \"1c54b1a9-e44d-4291-acc0-695eaeefb002\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-q97s8" Apr 17 17:50:53.998092 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:50:53.998024 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/1c54b1a9-e44d-4291-acc0-695eaeefb002-istio-envoy\") pod \"router-gateway-2-openshift-default-6866b85949-q97s8\" (UID: \"1c54b1a9-e44d-4291-acc0-695eaeefb002\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-q97s8" Apr 17 17:50:53.998092 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:50:53.998065 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/1c54b1a9-e44d-4291-acc0-695eaeefb002-istio-token\") pod \"router-gateway-2-openshift-default-6866b85949-q97s8\" (UID: \"1c54b1a9-e44d-4291-acc0-695eaeefb002\") " 
pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-q97s8" Apr 17 17:50:53.998092 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:50:53.998094 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/1c54b1a9-e44d-4291-acc0-695eaeefb002-istio-data\") pod \"router-gateway-2-openshift-default-6866b85949-q97s8\" (UID: \"1c54b1a9-e44d-4291-acc0-695eaeefb002\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-q97s8" Apr 17 17:50:53.998265 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:50:53.998115 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fspn2\" (UniqueName: \"kubernetes.io/projected/1c54b1a9-e44d-4291-acc0-695eaeefb002-kube-api-access-fspn2\") pod \"router-gateway-2-openshift-default-6866b85949-q97s8\" (UID: \"1c54b1a9-e44d-4291-acc0-695eaeefb002\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-q97s8" Apr 17 17:50:53.998265 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:50:53.998142 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/1c54b1a9-e44d-4291-acc0-695eaeefb002-credential-socket\") pod \"router-gateway-2-openshift-default-6866b85949-q97s8\" (UID: \"1c54b1a9-e44d-4291-acc0-695eaeefb002\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-q97s8" Apr 17 17:50:53.998265 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:50:53.998170 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/1c54b1a9-e44d-4291-acc0-695eaeefb002-workload-socket\") pod \"router-gateway-2-openshift-default-6866b85949-q97s8\" (UID: \"1c54b1a9-e44d-4291-acc0-695eaeefb002\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-q97s8" Apr 17 17:50:53.998412 ip-10-0-139-84 kubenswrapper[2573]: 
I0417 17:50:53.998312 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/1c54b1a9-e44d-4291-acc0-695eaeefb002-workload-certs\") pod \"router-gateway-2-openshift-default-6866b85949-q97s8\" (UID: \"1c54b1a9-e44d-4291-acc0-695eaeefb002\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-q97s8" Apr 17 17:50:53.998412 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:50:53.998365 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/1c54b1a9-e44d-4291-acc0-695eaeefb002-istiod-ca-cert\") pod \"router-gateway-2-openshift-default-6866b85949-q97s8\" (UID: \"1c54b1a9-e44d-4291-acc0-695eaeefb002\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-q97s8" Apr 17 17:50:53.998412 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:50:53.998400 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/1c54b1a9-e44d-4291-acc0-695eaeefb002-istio-podinfo\") pod \"router-gateway-2-openshift-default-6866b85949-q97s8\" (UID: \"1c54b1a9-e44d-4291-acc0-695eaeefb002\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-q97s8" Apr 17 17:50:53.998567 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:50:53.998459 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/1c54b1a9-e44d-4291-acc0-695eaeefb002-istio-data\") pod \"router-gateway-2-openshift-default-6866b85949-q97s8\" (UID: \"1c54b1a9-e44d-4291-acc0-695eaeefb002\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-q97s8" Apr 17 17:50:53.998627 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:50:53.998583 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: 
\"kubernetes.io/empty-dir/1c54b1a9-e44d-4291-acc0-695eaeefb002-credential-socket\") pod \"router-gateway-2-openshift-default-6866b85949-q97s8\" (UID: \"1c54b1a9-e44d-4291-acc0-695eaeefb002\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-q97s8" Apr 17 17:50:53.998690 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:50:53.998672 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/1c54b1a9-e44d-4291-acc0-695eaeefb002-workload-socket\") pod \"router-gateway-2-openshift-default-6866b85949-q97s8\" (UID: \"1c54b1a9-e44d-4291-acc0-695eaeefb002\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-q97s8" Apr 17 17:50:53.998747 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:50:53.998728 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/1c54b1a9-e44d-4291-acc0-695eaeefb002-workload-certs\") pod \"router-gateway-2-openshift-default-6866b85949-q97s8\" (UID: \"1c54b1a9-e44d-4291-acc0-695eaeefb002\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-q97s8" Apr 17 17:50:53.999107 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:50:53.999088 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/1c54b1a9-e44d-4291-acc0-695eaeefb002-istiod-ca-cert\") pod \"router-gateway-2-openshift-default-6866b85949-q97s8\" (UID: \"1c54b1a9-e44d-4291-acc0-695eaeefb002\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-q97s8" Apr 17 17:50:54.000314 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:50:54.000288 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/1c54b1a9-e44d-4291-acc0-695eaeefb002-istio-envoy\") pod \"router-gateway-2-openshift-default-6866b85949-q97s8\" (UID: 
\"1c54b1a9-e44d-4291-acc0-695eaeefb002\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-q97s8" Apr 17 17:50:54.000687 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:50:54.000668 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/1c54b1a9-e44d-4291-acc0-695eaeefb002-istio-podinfo\") pod \"router-gateway-2-openshift-default-6866b85949-q97s8\" (UID: \"1c54b1a9-e44d-4291-acc0-695eaeefb002\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-q97s8" Apr 17 17:50:54.005950 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:50:54.005930 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fspn2\" (UniqueName: \"kubernetes.io/projected/1c54b1a9-e44d-4291-acc0-695eaeefb002-kube-api-access-fspn2\") pod \"router-gateway-2-openshift-default-6866b85949-q97s8\" (UID: \"1c54b1a9-e44d-4291-acc0-695eaeefb002\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-q97s8" Apr 17 17:50:54.007376 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:50:54.007354 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/1c54b1a9-e44d-4291-acc0-695eaeefb002-istio-token\") pod \"router-gateway-2-openshift-default-6866b85949-q97s8\" (UID: \"1c54b1a9-e44d-4291-acc0-695eaeefb002\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-q97s8" Apr 17 17:50:54.128690 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:50:54.128665 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-q97s8" Apr 17 17:50:54.250105 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:50:54.250039 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-q97s8"] Apr 17 17:50:54.253271 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:50:54.253246 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1c54b1a9_e44d_4291_acc0_695eaeefb002.slice/crio-66aab096df563f3b1d84284e7d626ed2b50523688398f84a5d5903c94f415dbf WatchSource:0}: Error finding container 66aab096df563f3b1d84284e7d626ed2b50523688398f84a5d5903c94f415dbf: Status 404 returned error can't find the container with id 66aab096df563f3b1d84284e7d626ed2b50523688398f84a5d5903c94f415dbf Apr 17 17:50:54.431837 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:50:54.431806 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-q97s8" event={"ID":"1c54b1a9-e44d-4291-acc0-695eaeefb002","Type":"ContainerStarted","Data":"66aab096df563f3b1d84284e7d626ed2b50523688398f84a5d5903c94f415dbf"} Apr 17 17:50:57.056679 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:50:57.056639 2573 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236216Ki","pods":"250"} Apr 17 17:50:57.057035 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:50:57.056713 2573 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236216Ki","pods":"250"} Apr 17 17:50:57.057035 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:50:57.056748 2573 kubelet_resources.go:45] "Allocatable" 
allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236216Ki","pods":"250"} Apr 17 17:50:57.442466 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:50:57.442433 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-q97s8" event={"ID":"1c54b1a9-e44d-4291-acc0-695eaeefb002","Type":"ContainerStarted","Data":"54b4b8ff0d9e4aa4ab65ee5453feb367737581d9c7260ccb17d4525d1c93a51c"} Apr 17 17:50:57.464598 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:50:57.464551 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-q97s8" podStartSLOduration=1.6632263630000002 podStartE2EDuration="4.464532572s" podCreationTimestamp="2026-04-17 17:50:53 +0000 UTC" firstStartedPulling="2026-04-17 17:50:54.255108257 +0000 UTC m=+1599.571190078" lastFinishedPulling="2026-04-17 17:50:57.056414452 +0000 UTC m=+1602.372496287" observedRunningTime="2026-04-17 17:50:57.463071113 +0000 UTC m=+1602.779152958" watchObservedRunningTime="2026-04-17 17:50:57.464532572 +0000 UTC m=+1602.780614393" Apr 17 17:50:58.129787 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:50:58.129762 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-q97s8" Apr 17 17:50:58.131091 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:50:58.131055 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-q97s8" podUID="1c54b1a9-e44d-4291-acc0-695eaeefb002" containerName="istio-proxy" probeResult="failure" output="Get \"http://10.133.0.35:15021/healthz/ready\": dial tcp 10.133.0.35:15021: connect: connection refused" Apr 17 17:50:59.129802 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:50:59.129761 2573 prober.go:120] "Probe failed" probeType="Startup" 
pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-q97s8" podUID="1c54b1a9-e44d-4291-acc0-695eaeefb002" containerName="istio-proxy" probeResult="failure" output="Get \"http://10.133.0.35:15021/healthz/ready\": dial tcp 10.133.0.35:15021: connect: connection refused" Apr 17 17:51:00.132787 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:51:00.132756 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-q97s8" Apr 17 17:51:00.453073 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:51:00.452983 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-q97s8" Apr 17 17:51:00.453987 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:51:00.453968 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-q97s8" Apr 17 17:51:22.833221 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:51:22.833137 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-c88b786f7-c7kxp"] Apr 17 17:51:22.837064 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:51:22.837010 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-c88b786f7-c7kxp" Apr 17 17:51:22.839568 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:51:22.839546 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-kserve-self-signed-certs\"" Apr 17 17:51:22.848307 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:51:22.848281 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-c88b786f7-c7kxp"] Apr 17 17:51:22.954576 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:51:22.954541 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ca78991e-2860-4ef0-8c44-de6fd51082f3-home\") pod \"router-with-refs-pd-test-kserve-prefill-c88b786f7-c7kxp\" (UID: \"ca78991e-2860-4ef0-8c44-de6fd51082f3\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-c88b786f7-c7kxp" Apr 17 17:51:22.954576 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:51:22.954580 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7hsn\" (UniqueName: \"kubernetes.io/projected/ca78991e-2860-4ef0-8c44-de6fd51082f3-kube-api-access-t7hsn\") pod \"router-with-refs-pd-test-kserve-prefill-c88b786f7-c7kxp\" (UID: \"ca78991e-2860-4ef0-8c44-de6fd51082f3\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-c88b786f7-c7kxp" Apr 17 17:51:22.954791 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:51:22.954618 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ca78991e-2860-4ef0-8c44-de6fd51082f3-dshm\") pod \"router-with-refs-pd-test-kserve-prefill-c88b786f7-c7kxp\" (UID: \"ca78991e-2860-4ef0-8c44-de6fd51082f3\") " 
pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-c88b786f7-c7kxp" Apr 17 17:51:22.954791 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:51:22.954651 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ca78991e-2860-4ef0-8c44-de6fd51082f3-tls-certs\") pod \"router-with-refs-pd-test-kserve-prefill-c88b786f7-c7kxp\" (UID: \"ca78991e-2860-4ef0-8c44-de6fd51082f3\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-c88b786f7-c7kxp" Apr 17 17:51:22.954791 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:51:22.954676 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ca78991e-2860-4ef0-8c44-de6fd51082f3-model-cache\") pod \"router-with-refs-pd-test-kserve-prefill-c88b786f7-c7kxp\" (UID: \"ca78991e-2860-4ef0-8c44-de6fd51082f3\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-c88b786f7-c7kxp" Apr 17 17:51:22.954791 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:51:22.954704 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ca78991e-2860-4ef0-8c44-de6fd51082f3-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-prefill-c88b786f7-c7kxp\" (UID: \"ca78991e-2860-4ef0-8c44-de6fd51082f3\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-c88b786f7-c7kxp" Apr 17 17:51:23.055200 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:51:23.055171 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ca78991e-2860-4ef0-8c44-de6fd51082f3-home\") pod \"router-with-refs-pd-test-kserve-prefill-c88b786f7-c7kxp\" (UID: \"ca78991e-2860-4ef0-8c44-de6fd51082f3\") " 
pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-c88b786f7-c7kxp" Apr 17 17:51:23.055200 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:51:23.055205 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t7hsn\" (UniqueName: \"kubernetes.io/projected/ca78991e-2860-4ef0-8c44-de6fd51082f3-kube-api-access-t7hsn\") pod \"router-with-refs-pd-test-kserve-prefill-c88b786f7-c7kxp\" (UID: \"ca78991e-2860-4ef0-8c44-de6fd51082f3\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-c88b786f7-c7kxp" Apr 17 17:51:23.055406 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:51:23.055232 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ca78991e-2860-4ef0-8c44-de6fd51082f3-dshm\") pod \"router-with-refs-pd-test-kserve-prefill-c88b786f7-c7kxp\" (UID: \"ca78991e-2860-4ef0-8c44-de6fd51082f3\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-c88b786f7-c7kxp" Apr 17 17:51:23.055406 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:51:23.055256 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ca78991e-2860-4ef0-8c44-de6fd51082f3-tls-certs\") pod \"router-with-refs-pd-test-kserve-prefill-c88b786f7-c7kxp\" (UID: \"ca78991e-2860-4ef0-8c44-de6fd51082f3\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-c88b786f7-c7kxp" Apr 17 17:51:23.055486 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:51:23.055427 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ca78991e-2860-4ef0-8c44-de6fd51082f3-model-cache\") pod \"router-with-refs-pd-test-kserve-prefill-c88b786f7-c7kxp\" (UID: \"ca78991e-2860-4ef0-8c44-de6fd51082f3\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-c88b786f7-c7kxp" Apr 17 17:51:23.055486 ip-10-0-139-84 kubenswrapper[2573]: 
I0417 17:51:23.055466 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ca78991e-2860-4ef0-8c44-de6fd51082f3-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-prefill-c88b786f7-c7kxp\" (UID: \"ca78991e-2860-4ef0-8c44-de6fd51082f3\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-c88b786f7-c7kxp" Apr 17 17:51:23.055638 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:51:23.055611 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ca78991e-2860-4ef0-8c44-de6fd51082f3-home\") pod \"router-with-refs-pd-test-kserve-prefill-c88b786f7-c7kxp\" (UID: \"ca78991e-2860-4ef0-8c44-de6fd51082f3\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-c88b786f7-c7kxp" Apr 17 17:51:23.055698 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:51:23.055685 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ca78991e-2860-4ef0-8c44-de6fd51082f3-model-cache\") pod \"router-with-refs-pd-test-kserve-prefill-c88b786f7-c7kxp\" (UID: \"ca78991e-2860-4ef0-8c44-de6fd51082f3\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-c88b786f7-c7kxp" Apr 17 17:51:23.055750 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:51:23.055738 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ca78991e-2860-4ef0-8c44-de6fd51082f3-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-prefill-c88b786f7-c7kxp\" (UID: \"ca78991e-2860-4ef0-8c44-de6fd51082f3\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-c88b786f7-c7kxp" Apr 17 17:51:23.057412 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:51:23.057387 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: 
\"kubernetes.io/empty-dir/ca78991e-2860-4ef0-8c44-de6fd51082f3-dshm\") pod \"router-with-refs-pd-test-kserve-prefill-c88b786f7-c7kxp\" (UID: \"ca78991e-2860-4ef0-8c44-de6fd51082f3\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-c88b786f7-c7kxp" Apr 17 17:51:23.057776 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:51:23.057755 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ca78991e-2860-4ef0-8c44-de6fd51082f3-tls-certs\") pod \"router-with-refs-pd-test-kserve-prefill-c88b786f7-c7kxp\" (UID: \"ca78991e-2860-4ef0-8c44-de6fd51082f3\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-c88b786f7-c7kxp" Apr 17 17:51:23.064769 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:51:23.064746 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7hsn\" (UniqueName: \"kubernetes.io/projected/ca78991e-2860-4ef0-8c44-de6fd51082f3-kube-api-access-t7hsn\") pod \"router-with-refs-pd-test-kserve-prefill-c88b786f7-c7kxp\" (UID: \"ca78991e-2860-4ef0-8c44-de6fd51082f3\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-c88b786f7-c7kxp" Apr 17 17:51:23.150910 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:51:23.150876 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-c88b786f7-c7kxp" Apr 17 17:51:23.266929 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:51:23.266896 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-c88b786f7-c7kxp"] Apr 17 17:51:23.269435 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:51:23.269402 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca78991e_2860_4ef0_8c44_de6fd51082f3.slice/crio-0dde35c4b49a3d6883b6cc66c6c55bd90de353658e6322d7959c963323132e0a WatchSource:0}: Error finding container 0dde35c4b49a3d6883b6cc66c6c55bd90de353658e6322d7959c963323132e0a: Status 404 returned error can't find the container with id 0dde35c4b49a3d6883b6cc66c6c55bd90de353658e6322d7959c963323132e0a Apr 17 17:51:23.527249 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:51:23.527170 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-c88b786f7-c7kxp" event={"ID":"ca78991e-2860-4ef0-8c44-de6fd51082f3","Type":"ContainerStarted","Data":"0f648687a6c3ff4b30e4dc0278bf7e71e4c97b5e7d75e7b196ec4d16c0760acd"} Apr 17 17:51:23.527249 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:51:23.527207 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-c88b786f7-c7kxp" event={"ID":"ca78991e-2860-4ef0-8c44-de6fd51082f3","Type":"ContainerStarted","Data":"0dde35c4b49a3d6883b6cc66c6c55bd90de353658e6322d7959c963323132e0a"} Apr 17 17:51:23.713800 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:51:23.713762 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b5496cb4d-lv2cp" podUID="2c6c05e1-9ed7-47cf-b3d9-fb52c0dcbaf8" containerName="llm-d-routing-sidecar" 
containerID="cri-o://d14cacd181dbcc6b375bb208538dd52a91c064f52401e8d28803007394277c25" gracePeriod=2 Apr 17 17:51:23.996313 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:51:23.996288 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-75777d65d9-k7tj7" Apr 17 17:51:24.002211 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:51:24.002187 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-pd-test-kserve-6b5496cb4d-lv2cp_2c6c05e1-9ed7-47cf-b3d9-fb52c0dcbaf8/main/0.log" Apr 17 17:51:24.002864 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:51:24.002839 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b5496cb4d-lv2cp" Apr 17 17:51:24.061808 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:51:24.061715 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2c6c05e1-9ed7-47cf-b3d9-fb52c0dcbaf8-dshm\") pod \"2c6c05e1-9ed7-47cf-b3d9-fb52c0dcbaf8\" (UID: \"2c6c05e1-9ed7-47cf-b3d9-fb52c0dcbaf8\") " Apr 17 17:51:24.061808 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:51:24.061789 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/2c6c05e1-9ed7-47cf-b3d9-fb52c0dcbaf8-home\") pod \"2c6c05e1-9ed7-47cf-b3d9-fb52c0dcbaf8\" (UID: \"2c6c05e1-9ed7-47cf-b3d9-fb52c0dcbaf8\") " Apr 17 17:51:24.062033 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:51:24.061887 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/030ac2a5-cc7d-49c9-ae10-69cc1c17fd8b-model-cache\") pod \"030ac2a5-cc7d-49c9-ae10-69cc1c17fd8b\" (UID: \"030ac2a5-cc7d-49c9-ae10-69cc1c17fd8b\") " Apr 17 17:51:24.062033 ip-10-0-139-84 kubenswrapper[2573]: I0417 
17:51:24.061920 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/030ac2a5-cc7d-49c9-ae10-69cc1c17fd8b-dshm\") pod \"030ac2a5-cc7d-49c9-ae10-69cc1c17fd8b\" (UID: \"030ac2a5-cc7d-49c9-ae10-69cc1c17fd8b\") " Apr 17 17:51:24.062033 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:51:24.061943 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/2c6c05e1-9ed7-47cf-b3d9-fb52c0dcbaf8-model-cache\") pod \"2c6c05e1-9ed7-47cf-b3d9-fb52c0dcbaf8\" (UID: \"2c6c05e1-9ed7-47cf-b3d9-fb52c0dcbaf8\") " Apr 17 17:51:24.062033 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:51:24.061978 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d7dkk\" (UniqueName: \"kubernetes.io/projected/2c6c05e1-9ed7-47cf-b3d9-fb52c0dcbaf8-kube-api-access-d7dkk\") pod \"2c6c05e1-9ed7-47cf-b3d9-fb52c0dcbaf8\" (UID: \"2c6c05e1-9ed7-47cf-b3d9-fb52c0dcbaf8\") " Apr 17 17:51:24.062033 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:51:24.062008 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2c6c05e1-9ed7-47cf-b3d9-fb52c0dcbaf8-tls-certs\") pod \"2c6c05e1-9ed7-47cf-b3d9-fb52c0dcbaf8\" (UID: \"2c6c05e1-9ed7-47cf-b3d9-fb52c0dcbaf8\") " Apr 17 17:51:24.062300 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:51:24.062047 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/030ac2a5-cc7d-49c9-ae10-69cc1c17fd8b-home\") pod \"030ac2a5-cc7d-49c9-ae10-69cc1c17fd8b\" (UID: \"030ac2a5-cc7d-49c9-ae10-69cc1c17fd8b\") " Apr 17 17:51:24.062300 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:51:24.062072 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/030ac2a5-cc7d-49c9-ae10-69cc1c17fd8b-kserve-provision-location\") pod \"030ac2a5-cc7d-49c9-ae10-69cc1c17fd8b\" (UID: \"030ac2a5-cc7d-49c9-ae10-69cc1c17fd8b\") " Apr 17 17:51:24.062300 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:51:24.062245 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c6c05e1-9ed7-47cf-b3d9-fb52c0dcbaf8-home" (OuterVolumeSpecName: "home") pod "2c6c05e1-9ed7-47cf-b3d9-fb52c0dcbaf8" (UID: "2c6c05e1-9ed7-47cf-b3d9-fb52c0dcbaf8"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:51:24.062300 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:51:24.062241 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/030ac2a5-cc7d-49c9-ae10-69cc1c17fd8b-model-cache" (OuterVolumeSpecName: "model-cache") pod "030ac2a5-cc7d-49c9-ae10-69cc1c17fd8b" (UID: "030ac2a5-cc7d-49c9-ae10-69cc1c17fd8b"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:51:24.063926 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:51:24.062357 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c6c05e1-9ed7-47cf-b3d9-fb52c0dcbaf8-model-cache" (OuterVolumeSpecName: "model-cache") pod "2c6c05e1-9ed7-47cf-b3d9-fb52c0dcbaf8" (UID: "2c6c05e1-9ed7-47cf-b3d9-fb52c0dcbaf8"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:51:24.063926 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:51:24.062562 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/030ac2a5-cc7d-49c9-ae10-69cc1c17fd8b-home" (OuterVolumeSpecName: "home") pod "030ac2a5-cc7d-49c9-ae10-69cc1c17fd8b" (UID: "030ac2a5-cc7d-49c9-ae10-69cc1c17fd8b"). InnerVolumeSpecName "home". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:51:24.063926 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:51:24.062443 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2c6c05e1-9ed7-47cf-b3d9-fb52c0dcbaf8-kserve-provision-location\") pod \"2c6c05e1-9ed7-47cf-b3d9-fb52c0dcbaf8\" (UID: \"2c6c05e1-9ed7-47cf-b3d9-fb52c0dcbaf8\") " Apr 17 17:51:24.063926 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:51:24.063038 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/030ac2a5-cc7d-49c9-ae10-69cc1c17fd8b-tls-certs\") pod \"030ac2a5-cc7d-49c9-ae10-69cc1c17fd8b\" (UID: \"030ac2a5-cc7d-49c9-ae10-69cc1c17fd8b\") " Apr 17 17:51:24.063926 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:51:24.063085 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nxbvg\" (UniqueName: \"kubernetes.io/projected/030ac2a5-cc7d-49c9-ae10-69cc1c17fd8b-kube-api-access-nxbvg\") pod \"030ac2a5-cc7d-49c9-ae10-69cc1c17fd8b\" (UID: \"030ac2a5-cc7d-49c9-ae10-69cc1c17fd8b\") " Apr 17 17:51:24.063926 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:51:24.063346 2573 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/2c6c05e1-9ed7-47cf-b3d9-fb52c0dcbaf8-home\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\"" Apr 17 17:51:24.063926 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:51:24.063365 2573 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/030ac2a5-cc7d-49c9-ae10-69cc1c17fd8b-model-cache\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\"" Apr 17 17:51:24.063926 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:51:24.063379 2573 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/2c6c05e1-9ed7-47cf-b3d9-fb52c0dcbaf8-model-cache\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\"" Apr 17 17:51:24.063926 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:51:24.063394 2573 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/030ac2a5-cc7d-49c9-ae10-69cc1c17fd8b-home\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\"" Apr 17 17:51:24.065379 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:51:24.065159 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c6c05e1-9ed7-47cf-b3d9-fb52c0dcbaf8-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "2c6c05e1-9ed7-47cf-b3d9-fb52c0dcbaf8" (UID: "2c6c05e1-9ed7-47cf-b3d9-fb52c0dcbaf8"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:51:24.065991 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:51:24.065974 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c6c05e1-9ed7-47cf-b3d9-fb52c0dcbaf8-dshm" (OuterVolumeSpecName: "dshm") pod "2c6c05e1-9ed7-47cf-b3d9-fb52c0dcbaf8" (UID: "2c6c05e1-9ed7-47cf-b3d9-fb52c0dcbaf8"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:51:24.066168 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:51:24.066135 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/030ac2a5-cc7d-49c9-ae10-69cc1c17fd8b-dshm" (OuterVolumeSpecName: "dshm") pod "030ac2a5-cc7d-49c9-ae10-69cc1c17fd8b" (UID: "030ac2a5-cc7d-49c9-ae10-69cc1c17fd8b"). InnerVolumeSpecName "dshm". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:51:24.066582 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:51:24.066548 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c6c05e1-9ed7-47cf-b3d9-fb52c0dcbaf8-kube-api-access-d7dkk" (OuterVolumeSpecName: "kube-api-access-d7dkk") pod "2c6c05e1-9ed7-47cf-b3d9-fb52c0dcbaf8" (UID: "2c6c05e1-9ed7-47cf-b3d9-fb52c0dcbaf8"). InnerVolumeSpecName "kube-api-access-d7dkk". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:51:24.066806 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:51:24.066785 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/030ac2a5-cc7d-49c9-ae10-69cc1c17fd8b-kube-api-access-nxbvg" (OuterVolumeSpecName: "kube-api-access-nxbvg") pod "030ac2a5-cc7d-49c9-ae10-69cc1c17fd8b" (UID: "030ac2a5-cc7d-49c9-ae10-69cc1c17fd8b"). InnerVolumeSpecName "kube-api-access-nxbvg". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:51:24.068310 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:51:24.068289 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/030ac2a5-cc7d-49c9-ae10-69cc1c17fd8b-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "030ac2a5-cc7d-49c9-ae10-69cc1c17fd8b" (UID: "030ac2a5-cc7d-49c9-ae10-69cc1c17fd8b"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:51:24.141188 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:51:24.141151 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/030ac2a5-cc7d-49c9-ae10-69cc1c17fd8b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "030ac2a5-cc7d-49c9-ae10-69cc1c17fd8b" (UID: "030ac2a5-cc7d-49c9-ae10-69cc1c17fd8b"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:51:24.143431 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:51:24.143406 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c6c05e1-9ed7-47cf-b3d9-fb52c0dcbaf8-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "2c6c05e1-9ed7-47cf-b3d9-fb52c0dcbaf8" (UID: "2c6c05e1-9ed7-47cf-b3d9-fb52c0dcbaf8"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:51:24.164330 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:51:24.164308 2573 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/030ac2a5-cc7d-49c9-ae10-69cc1c17fd8b-tls-certs\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\"" Apr 17 17:51:24.164330 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:51:24.164330 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nxbvg\" (UniqueName: \"kubernetes.io/projected/030ac2a5-cc7d-49c9-ae10-69cc1c17fd8b-kube-api-access-nxbvg\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\"" Apr 17 17:51:24.164453 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:51:24.164340 2573 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2c6c05e1-9ed7-47cf-b3d9-fb52c0dcbaf8-dshm\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\"" Apr 17 17:51:24.164453 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:51:24.164349 2573 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/030ac2a5-cc7d-49c9-ae10-69cc1c17fd8b-dshm\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\"" Apr 17 17:51:24.164453 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:51:24.164358 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-d7dkk\" (UniqueName: 
\"kubernetes.io/projected/2c6c05e1-9ed7-47cf-b3d9-fb52c0dcbaf8-kube-api-access-d7dkk\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\"" Apr 17 17:51:24.164453 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:51:24.164366 2573 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2c6c05e1-9ed7-47cf-b3d9-fb52c0dcbaf8-tls-certs\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\"" Apr 17 17:51:24.164453 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:51:24.164376 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/030ac2a5-cc7d-49c9-ae10-69cc1c17fd8b-kserve-provision-location\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\"" Apr 17 17:51:24.164453 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:51:24.164385 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2c6c05e1-9ed7-47cf-b3d9-fb52c0dcbaf8-kserve-provision-location\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\"" Apr 17 17:51:24.532349 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:51:24.532323 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-pd-test-kserve-6b5496cb4d-lv2cp_2c6c05e1-9ed7-47cf-b3d9-fb52c0dcbaf8/main/0.log" Apr 17 17:51:24.533419 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:51:24.533355 2573 generic.go:358] "Generic (PLEG): container finished" podID="2c6c05e1-9ed7-47cf-b3d9-fb52c0dcbaf8" containerID="1ff3d399e35d3e87bf26c7d15e5059abe6f0ea7f4c300768ca11e27a6580323c" exitCode=137 Apr 17 17:51:24.533419 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:51:24.533411 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b5496cb4d-lv2cp" Apr 17 17:51:24.533419 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:51:24.533417 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b5496cb4d-lv2cp" event={"ID":"2c6c05e1-9ed7-47cf-b3d9-fb52c0dcbaf8","Type":"ContainerDied","Data":"1ff3d399e35d3e87bf26c7d15e5059abe6f0ea7f4c300768ca11e27a6580323c"} Apr 17 17:51:24.533653 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:51:24.533446 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b5496cb4d-lv2cp" event={"ID":"2c6c05e1-9ed7-47cf-b3d9-fb52c0dcbaf8","Type":"ContainerDied","Data":"d14cacd181dbcc6b375bb208538dd52a91c064f52401e8d28803007394277c25"} Apr 17 17:51:24.533653 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:51:24.533462 2573 scope.go:117] "RemoveContainer" containerID="1ff3d399e35d3e87bf26c7d15e5059abe6f0ea7f4c300768ca11e27a6580323c" Apr 17 17:51:24.533653 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:51:24.533418 2573 generic.go:358] "Generic (PLEG): container finished" podID="2c6c05e1-9ed7-47cf-b3d9-fb52c0dcbaf8" containerID="d14cacd181dbcc6b375bb208538dd52a91c064f52401e8d28803007394277c25" exitCode=0 Apr 17 17:51:24.533653 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:51:24.533602 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b5496cb4d-lv2cp" event={"ID":"2c6c05e1-9ed7-47cf-b3d9-fb52c0dcbaf8","Type":"ContainerDied","Data":"a1eea0a9d2ceab0d26db66e85071e82b6e528027406b4b15c148e3cf4bd05d60"} Apr 17 17:51:24.535473 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:51:24.535446 2573 generic.go:358] "Generic (PLEG): container finished" podID="030ac2a5-cc7d-49c9-ae10-69cc1c17fd8b" containerID="0799c2815589d644f0c31c013280acd81c9e3853d1cd2abf3bd3ce16030ddf2b" exitCode=137 Apr 17 17:51:24.535608 ip-10-0-139-84 kubenswrapper[2573]: 
I0417 17:51:24.535504 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-75777d65d9-k7tj7" Apr 17 17:51:24.535608 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:51:24.535522 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-75777d65d9-k7tj7" event={"ID":"030ac2a5-cc7d-49c9-ae10-69cc1c17fd8b","Type":"ContainerDied","Data":"0799c2815589d644f0c31c013280acd81c9e3853d1cd2abf3bd3ce16030ddf2b"} Apr 17 17:51:24.535608 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:51:24.535561 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-75777d65d9-k7tj7" event={"ID":"030ac2a5-cc7d-49c9-ae10-69cc1c17fd8b","Type":"ContainerDied","Data":"2ca64a0ea80620d279565126c1345fefc34fddac1107e0f1a8fbc87e0b71136a"} Apr 17 17:51:24.555909 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:51:24.555891 2573 scope.go:117] "RemoveContainer" containerID="cdba6578612acddac698fe562aa54ee96bd6dcac3e49e73ef32896b354953fe8" Apr 17 17:51:24.563494 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:51:24.563458 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b5496cb4d-lv2cp"] Apr 17 17:51:24.567540 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:51:24.567514 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b5496cb4d-lv2cp"] Apr 17 17:51:24.579663 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:51:24.579635 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-75777d65d9-k7tj7"] Apr 17 17:51:24.583271 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:51:24.583245 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-75777d65d9-k7tj7"] Apr 17 17:51:24.619826 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:51:24.619196 2573 scope.go:117] "RemoveContainer" containerID="d14cacd181dbcc6b375bb208538dd52a91c064f52401e8d28803007394277c25" Apr 17 17:51:24.629191 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:51:24.629172 2573 scope.go:117] "RemoveContainer" containerID="1ff3d399e35d3e87bf26c7d15e5059abe6f0ea7f4c300768ca11e27a6580323c" Apr 17 17:51:24.629483 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:51:24.629462 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ff3d399e35d3e87bf26c7d15e5059abe6f0ea7f4c300768ca11e27a6580323c\": container with ID starting with 1ff3d399e35d3e87bf26c7d15e5059abe6f0ea7f4c300768ca11e27a6580323c not found: ID does not exist" containerID="1ff3d399e35d3e87bf26c7d15e5059abe6f0ea7f4c300768ca11e27a6580323c" Apr 17 17:51:24.629580 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:51:24.629491 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ff3d399e35d3e87bf26c7d15e5059abe6f0ea7f4c300768ca11e27a6580323c"} err="failed to get container status \"1ff3d399e35d3e87bf26c7d15e5059abe6f0ea7f4c300768ca11e27a6580323c\": rpc error: code = NotFound desc = could not find container \"1ff3d399e35d3e87bf26c7d15e5059abe6f0ea7f4c300768ca11e27a6580323c\": container with ID starting with 1ff3d399e35d3e87bf26c7d15e5059abe6f0ea7f4c300768ca11e27a6580323c not found: ID does not exist" Apr 17 17:51:24.629580 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:51:24.629509 2573 scope.go:117] "RemoveContainer" containerID="cdba6578612acddac698fe562aa54ee96bd6dcac3e49e73ef32896b354953fe8" Apr 17 17:51:24.629771 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:51:24.629754 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"cdba6578612acddac698fe562aa54ee96bd6dcac3e49e73ef32896b354953fe8\": container with ID starting with cdba6578612acddac698fe562aa54ee96bd6dcac3e49e73ef32896b354953fe8 not found: ID does not exist" containerID="cdba6578612acddac698fe562aa54ee96bd6dcac3e49e73ef32896b354953fe8" Apr 17 17:51:24.629820 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:51:24.629777 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cdba6578612acddac698fe562aa54ee96bd6dcac3e49e73ef32896b354953fe8"} err="failed to get container status \"cdba6578612acddac698fe562aa54ee96bd6dcac3e49e73ef32896b354953fe8\": rpc error: code = NotFound desc = could not find container \"cdba6578612acddac698fe562aa54ee96bd6dcac3e49e73ef32896b354953fe8\": container with ID starting with cdba6578612acddac698fe562aa54ee96bd6dcac3e49e73ef32896b354953fe8 not found: ID does not exist" Apr 17 17:51:24.629820 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:51:24.629790 2573 scope.go:117] "RemoveContainer" containerID="d14cacd181dbcc6b375bb208538dd52a91c064f52401e8d28803007394277c25" Apr 17 17:51:24.630049 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:51:24.630029 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d14cacd181dbcc6b375bb208538dd52a91c064f52401e8d28803007394277c25\": container with ID starting with d14cacd181dbcc6b375bb208538dd52a91c064f52401e8d28803007394277c25 not found: ID does not exist" containerID="d14cacd181dbcc6b375bb208538dd52a91c064f52401e8d28803007394277c25" Apr 17 17:51:24.630101 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:51:24.630054 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d14cacd181dbcc6b375bb208538dd52a91c064f52401e8d28803007394277c25"} err="failed to get container status \"d14cacd181dbcc6b375bb208538dd52a91c064f52401e8d28803007394277c25\": rpc error: code = NotFound desc = could not find container 
\"d14cacd181dbcc6b375bb208538dd52a91c064f52401e8d28803007394277c25\": container with ID starting with d14cacd181dbcc6b375bb208538dd52a91c064f52401e8d28803007394277c25 not found: ID does not exist" Apr 17 17:51:24.630101 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:51:24.630070 2573 scope.go:117] "RemoveContainer" containerID="1ff3d399e35d3e87bf26c7d15e5059abe6f0ea7f4c300768ca11e27a6580323c" Apr 17 17:51:24.630292 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:51:24.630274 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ff3d399e35d3e87bf26c7d15e5059abe6f0ea7f4c300768ca11e27a6580323c"} err="failed to get container status \"1ff3d399e35d3e87bf26c7d15e5059abe6f0ea7f4c300768ca11e27a6580323c\": rpc error: code = NotFound desc = could not find container \"1ff3d399e35d3e87bf26c7d15e5059abe6f0ea7f4c300768ca11e27a6580323c\": container with ID starting with 1ff3d399e35d3e87bf26c7d15e5059abe6f0ea7f4c300768ca11e27a6580323c not found: ID does not exist" Apr 17 17:51:24.630331 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:51:24.630301 2573 scope.go:117] "RemoveContainer" containerID="cdba6578612acddac698fe562aa54ee96bd6dcac3e49e73ef32896b354953fe8" Apr 17 17:51:24.630591 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:51:24.630571 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cdba6578612acddac698fe562aa54ee96bd6dcac3e49e73ef32896b354953fe8"} err="failed to get container status \"cdba6578612acddac698fe562aa54ee96bd6dcac3e49e73ef32896b354953fe8\": rpc error: code = NotFound desc = could not find container \"cdba6578612acddac698fe562aa54ee96bd6dcac3e49e73ef32896b354953fe8\": container with ID starting with cdba6578612acddac698fe562aa54ee96bd6dcac3e49e73ef32896b354953fe8 not found: ID does not exist" Apr 17 17:51:24.630659 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:51:24.630602 2573 scope.go:117] "RemoveContainer" 
containerID="d14cacd181dbcc6b375bb208538dd52a91c064f52401e8d28803007394277c25" Apr 17 17:51:24.630844 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:51:24.630824 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d14cacd181dbcc6b375bb208538dd52a91c064f52401e8d28803007394277c25"} err="failed to get container status \"d14cacd181dbcc6b375bb208538dd52a91c064f52401e8d28803007394277c25\": rpc error: code = NotFound desc = could not find container \"d14cacd181dbcc6b375bb208538dd52a91c064f52401e8d28803007394277c25\": container with ID starting with d14cacd181dbcc6b375bb208538dd52a91c064f52401e8d28803007394277c25 not found: ID does not exist" Apr 17 17:51:24.630895 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:51:24.630845 2573 scope.go:117] "RemoveContainer" containerID="0799c2815589d644f0c31c013280acd81c9e3853d1cd2abf3bd3ce16030ddf2b" Apr 17 17:51:24.650167 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:51:24.650145 2573 scope.go:117] "RemoveContainer" containerID="f0e37f006073840b9726ff688eddaa600f4d08ef5d121741eb0f4206a5ecf489" Apr 17 17:51:24.713516 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:51:24.713484 2573 scope.go:117] "RemoveContainer" containerID="0799c2815589d644f0c31c013280acd81c9e3853d1cd2abf3bd3ce16030ddf2b" Apr 17 17:51:24.713851 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:51:24.713829 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0799c2815589d644f0c31c013280acd81c9e3853d1cd2abf3bd3ce16030ddf2b\": container with ID starting with 0799c2815589d644f0c31c013280acd81c9e3853d1cd2abf3bd3ce16030ddf2b not found: ID does not exist" containerID="0799c2815589d644f0c31c013280acd81c9e3853d1cd2abf3bd3ce16030ddf2b" Apr 17 17:51:24.713932 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:51:24.713864 2573 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"0799c2815589d644f0c31c013280acd81c9e3853d1cd2abf3bd3ce16030ddf2b"} err="failed to get container status \"0799c2815589d644f0c31c013280acd81c9e3853d1cd2abf3bd3ce16030ddf2b\": rpc error: code = NotFound desc = could not find container \"0799c2815589d644f0c31c013280acd81c9e3853d1cd2abf3bd3ce16030ddf2b\": container with ID starting with 0799c2815589d644f0c31c013280acd81c9e3853d1cd2abf3bd3ce16030ddf2b not found: ID does not exist" Apr 17 17:51:24.713932 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:51:24.713895 2573 scope.go:117] "RemoveContainer" containerID="f0e37f006073840b9726ff688eddaa600f4d08ef5d121741eb0f4206a5ecf489" Apr 17 17:51:24.714284 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:51:24.714266 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0e37f006073840b9726ff688eddaa600f4d08ef5d121741eb0f4206a5ecf489\": container with ID starting with f0e37f006073840b9726ff688eddaa600f4d08ef5d121741eb0f4206a5ecf489 not found: ID does not exist" containerID="f0e37f006073840b9726ff688eddaa600f4d08ef5d121741eb0f4206a5ecf489" Apr 17 17:51:24.714339 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:51:24.714291 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0e37f006073840b9726ff688eddaa600f4d08ef5d121741eb0f4206a5ecf489"} err="failed to get container status \"f0e37f006073840b9726ff688eddaa600f4d08ef5d121741eb0f4206a5ecf489\": rpc error: code = NotFound desc = could not find container \"f0e37f006073840b9726ff688eddaa600f4d08ef5d121741eb0f4206a5ecf489\": container with ID starting with f0e37f006073840b9726ff688eddaa600f4d08ef5d121741eb0f4206a5ecf489 not found: ID does not exist" Apr 17 17:51:25.195587 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:51:25.195556 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="030ac2a5-cc7d-49c9-ae10-69cc1c17fd8b" 
path="/var/lib/kubelet/pods/030ac2a5-cc7d-49c9-ae10-69cc1c17fd8b/volumes" Apr 17 17:51:25.195966 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:51:25.195952 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c6c05e1-9ed7-47cf-b3d9-fb52c0dcbaf8" path="/var/lib/kubelet/pods/2c6c05e1-9ed7-47cf-b3d9-fb52c0dcbaf8/volumes" Apr 17 17:51:28.552294 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:51:28.552212 2573 generic.go:358] "Generic (PLEG): container finished" podID="ca78991e-2860-4ef0-8c44-de6fd51082f3" containerID="0f648687a6c3ff4b30e4dc0278bf7e71e4c97b5e7d75e7b196ec4d16c0760acd" exitCode=0 Apr 17 17:51:28.552641 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:51:28.552285 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-c88b786f7-c7kxp" event={"ID":"ca78991e-2860-4ef0-8c44-de6fd51082f3","Type":"ContainerDied","Data":"0f648687a6c3ff4b30e4dc0278bf7e71e4c97b5e7d75e7b196ec4d16c0760acd"} Apr 17 17:51:29.557538 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:51:29.557504 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-c88b786f7-c7kxp" event={"ID":"ca78991e-2860-4ef0-8c44-de6fd51082f3","Type":"ContainerStarted","Data":"2b7aa6fa42be530d642d05995fed0fdd8740ab8070a11da45992df34515ffb51"} Apr 17 17:51:29.583927 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:51:29.583870 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-c88b786f7-c7kxp" podStartSLOduration=7.583855317 podStartE2EDuration="7.583855317s" podCreationTimestamp="2026-04-17 17:51:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:51:29.581221112 +0000 UTC m=+1634.897302976" watchObservedRunningTime="2026-04-17 17:51:29.583855317 +0000 UTC m=+1634.899937159" Apr 17 17:51:33.151409 
ip-10-0-139-84 kubenswrapper[2573]: I0417 17:51:33.151366 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-c88b786f7-c7kxp" Apr 17 17:51:33.151409 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:51:33.151417 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-c88b786f7-c7kxp" Apr 17 17:51:33.152689 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:51:33.152660 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-c88b786f7-c7kxp" podUID="ca78991e-2860-4ef0-8c44-de6fd51082f3" containerName="main" probeResult="failure" output="Get \"https://10.133.0.36:8000/health\": dial tcp 10.133.0.36:8000: connect: connection refused" Apr 17 17:51:43.151457 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:51:43.151415 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-c88b786f7-c7kxp" podUID="ca78991e-2860-4ef0-8c44-de6fd51082f3" containerName="main" probeResult="failure" output="Get \"https://10.133.0.36:8000/health\": dial tcp 10.133.0.36:8000: connect: connection refused" Apr 17 17:51:53.151325 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:51:53.151272 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-c88b786f7-c7kxp" podUID="ca78991e-2860-4ef0-8c44-de6fd51082f3" containerName="main" probeResult="failure" output="Get \"https://10.133.0.36:8000/health\": dial tcp 10.133.0.36:8000: connect: connection refused" Apr 17 17:52:03.152324 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:52:03.152275 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-c88b786f7-c7kxp" podUID="ca78991e-2860-4ef0-8c44-de6fd51082f3" containerName="main" probeResult="failure" 
output="Get \"https://10.133.0.36:8000/health\": dial tcp 10.133.0.36:8000: connect: connection refused" Apr 17 17:52:13.152285 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:52:13.152236 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-c88b786f7-c7kxp" podUID="ca78991e-2860-4ef0-8c44-de6fd51082f3" containerName="main" probeResult="failure" output="Get \"https://10.133.0.36:8000/health\": dial tcp 10.133.0.36:8000: connect: connection refused" Apr 17 17:52:23.151580 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:52:23.151533 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-c88b786f7-c7kxp" podUID="ca78991e-2860-4ef0-8c44-de6fd51082f3" containerName="main" probeResult="failure" output="Get \"https://10.133.0.36:8000/health\": dial tcp 10.133.0.36:8000: connect: connection refused" Apr 17 17:52:33.152237 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:52:33.152180 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-c88b786f7-c7kxp" podUID="ca78991e-2860-4ef0-8c44-de6fd51082f3" containerName="main" probeResult="failure" output="Get \"https://10.133.0.36:8000/health\": dial tcp 10.133.0.36:8000: connect: connection refused" Apr 17 17:52:43.152105 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:52:43.151988 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-c88b786f7-c7kxp" podUID="ca78991e-2860-4ef0-8c44-de6fd51082f3" containerName="main" probeResult="failure" output="Get \"https://10.133.0.36:8000/health\": dial tcp 10.133.0.36:8000: connect: connection refused" Apr 17 17:52:53.151879 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:52:53.151836 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-c88b786f7-c7kxp" 
podUID="ca78991e-2860-4ef0-8c44-de6fd51082f3" containerName="main" probeResult="failure" output="Get \"https://10.133.0.36:8000/health\": dial tcp 10.133.0.36:8000: connect: connection refused" Apr 17 17:53:03.161328 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:53:03.161298 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-c88b786f7-c7kxp" Apr 17 17:53:03.168811 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:53:03.168782 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-c88b786f7-c7kxp" Apr 17 17:57:18.765699 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:57:18.765623 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-c88b786f7-c7kxp"] Apr 17 17:57:18.766148 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:57:18.765906 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-c88b786f7-c7kxp" podUID="ca78991e-2860-4ef0-8c44-de6fd51082f3" containerName="main" containerID="cri-o://2b7aa6fa42be530d642d05995fed0fdd8740ab8070a11da45992df34515ffb51" gracePeriod=30 Apr 17 17:57:34.155161 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:57:34.155127 2573 ???:1] "http: TLS handshake error from 10.0.139.84:50710: EOF" Apr 17 17:57:34.160874 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:57:34.160854 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-q97s8_1c54b1a9-e44d-4291-acc0-695eaeefb002/istio-proxy/0.log" Apr 17 17:57:34.271689 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:57:34.271658 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-c88b786f7-c7kxp_ca78991e-2860-4ef0-8c44-de6fd51082f3/main/0.log" Apr 17 
17:57:34.281992 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:57:34.281972 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-c88b786f7-c7kxp_ca78991e-2860-4ef0-8c44-de6fd51082f3/storage-initializer/0.log" Apr 17 17:57:35.287840 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:57:35.287811 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-q97s8_1c54b1a9-e44d-4291-acc0-695eaeefb002/istio-proxy/0.log" Apr 17 17:57:35.370568 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:57:35.370539 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-c88b786f7-c7kxp_ca78991e-2860-4ef0-8c44-de6fd51082f3/main/0.log" Apr 17 17:57:35.380943 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:57:35.380920 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-c88b786f7-c7kxp_ca78991e-2860-4ef0-8c44-de6fd51082f3/storage-initializer/0.log" Apr 17 17:57:36.380637 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:57:36.380600 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-q97s8_1c54b1a9-e44d-4291-acc0-695eaeefb002/istio-proxy/0.log" Apr 17 17:57:36.455445 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:57:36.455418 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-c88b786f7-c7kxp_ca78991e-2860-4ef0-8c44-de6fd51082f3/main/0.log" Apr 17 17:57:36.464551 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:57:36.464527 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-c88b786f7-c7kxp_ca78991e-2860-4ef0-8c44-de6fd51082f3/storage-initializer/0.log" Apr 17 17:57:37.435788 ip-10-0-139-84 kubenswrapper[2573]: I0417 
17:57:37.435754 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-q97s8_1c54b1a9-e44d-4291-acc0-695eaeefb002/istio-proxy/0.log" Apr 17 17:57:37.522717 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:57:37.522692 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-c88b786f7-c7kxp_ca78991e-2860-4ef0-8c44-de6fd51082f3/main/0.log" Apr 17 17:57:37.535396 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:57:37.535373 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-c88b786f7-c7kxp_ca78991e-2860-4ef0-8c44-de6fd51082f3/storage-initializer/0.log" Apr 17 17:57:38.560918 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:57:38.560894 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-q97s8_1c54b1a9-e44d-4291-acc0-695eaeefb002/istio-proxy/0.log" Apr 17 17:57:38.644636 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:57:38.644604 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-c88b786f7-c7kxp_ca78991e-2860-4ef0-8c44-de6fd51082f3/main/0.log" Apr 17 17:57:38.667689 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:57:38.667670 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-c88b786f7-c7kxp_ca78991e-2860-4ef0-8c44-de6fd51082f3/storage-initializer/0.log" Apr 17 17:57:39.654204 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:57:39.654172 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-q97s8_1c54b1a9-e44d-4291-acc0-695eaeefb002/istio-proxy/0.log" Apr 17 17:57:39.733895 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:57:39.733868 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-c88b786f7-c7kxp_ca78991e-2860-4ef0-8c44-de6fd51082f3/main/0.log" Apr 17 17:57:39.744435 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:57:39.744413 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-c88b786f7-c7kxp_ca78991e-2860-4ef0-8c44-de6fd51082f3/storage-initializer/0.log" Apr 17 17:57:40.829876 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:57:40.829848 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-q97s8_1c54b1a9-e44d-4291-acc0-695eaeefb002/istio-proxy/0.log" Apr 17 17:57:40.911433 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:57:40.911403 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-c88b786f7-c7kxp_ca78991e-2860-4ef0-8c44-de6fd51082f3/main/0.log" Apr 17 17:57:40.933570 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:57:40.933546 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-c88b786f7-c7kxp_ca78991e-2860-4ef0-8c44-de6fd51082f3/storage-initializer/0.log" Apr 17 17:57:41.984952 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:57:41.984923 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-q97s8_1c54b1a9-e44d-4291-acc0-695eaeefb002/istio-proxy/0.log" Apr 17 17:57:42.119877 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:57:42.119846 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-c88b786f7-c7kxp_ca78991e-2860-4ef0-8c44-de6fd51082f3/main/0.log" Apr 17 17:57:42.136213 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:57:42.136189 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-c88b786f7-c7kxp_ca78991e-2860-4ef0-8c44-de6fd51082f3/storage-initializer/0.log" Apr 17 17:57:43.178491 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:57:43.178464 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-q97s8_1c54b1a9-e44d-4291-acc0-695eaeefb002/istio-proxy/0.log" Apr 17 17:57:43.257224 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:57:43.257195 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-c88b786f7-c7kxp_ca78991e-2860-4ef0-8c44-de6fd51082f3/main/0.log" Apr 17 17:57:43.267406 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:57:43.267383 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-c88b786f7-c7kxp_ca78991e-2860-4ef0-8c44-de6fd51082f3/storage-initializer/0.log" Apr 17 17:57:44.258247 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:57:44.258222 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-q97s8_1c54b1a9-e44d-4291-acc0-695eaeefb002/istio-proxy/0.log" Apr 17 17:57:44.349910 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:57:44.349884 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-c88b786f7-c7kxp_ca78991e-2860-4ef0-8c44-de6fd51082f3/main/0.log" Apr 17 17:57:44.363720 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:57:44.363697 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-c88b786f7-c7kxp_ca78991e-2860-4ef0-8c44-de6fd51082f3/storage-initializer/0.log" Apr 17 17:57:45.369290 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:57:45.369257 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-q97s8_1c54b1a9-e44d-4291-acc0-695eaeefb002/istio-proxy/0.log" Apr 17 17:57:45.446456 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:57:45.446429 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-c88b786f7-c7kxp_ca78991e-2860-4ef0-8c44-de6fd51082f3/main/0.log" Apr 17 17:57:45.463293 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:57:45.463273 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-c88b786f7-c7kxp_ca78991e-2860-4ef0-8c44-de6fd51082f3/storage-initializer/0.log" Apr 17 17:57:46.425963 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:57:46.425933 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-q97s8_1c54b1a9-e44d-4291-acc0-695eaeefb002/istio-proxy/0.log" Apr 17 17:57:46.495286 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:57:46.495256 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-c88b786f7-c7kxp_ca78991e-2860-4ef0-8c44-de6fd51082f3/main/0.log" Apr 17 17:57:46.505754 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:57:46.505736 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-c88b786f7-c7kxp_ca78991e-2860-4ef0-8c44-de6fd51082f3/storage-initializer/0.log" Apr 17 17:57:47.493273 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:57:47.493249 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-q97s8_1c54b1a9-e44d-4291-acc0-695eaeefb002/istio-proxy/0.log" Apr 17 17:57:47.599414 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:57:47.599369 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-c88b786f7-c7kxp_ca78991e-2860-4ef0-8c44-de6fd51082f3/main/0.log" Apr 17 17:57:47.615592 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:57:47.615560 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-c88b786f7-c7kxp_ca78991e-2860-4ef0-8c44-de6fd51082f3/storage-initializer/0.log" Apr 17 17:57:48.616344 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:57:48.616317 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-q97s8_1c54b1a9-e44d-4291-acc0-695eaeefb002/istio-proxy/0.log" Apr 17 17:57:48.699079 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:57:48.699051 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-c88b786f7-c7kxp_ca78991e-2860-4ef0-8c44-de6fd51082f3/main/0.log" Apr 17 17:57:48.709307 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:57:48.709284 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-c88b786f7-c7kxp_ca78991e-2860-4ef0-8c44-de6fd51082f3/storage-initializer/0.log" Apr 17 17:57:48.988076 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:57:48.988056 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-c88b786f7-c7kxp" Apr 17 17:57:49.116310 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:57:49.116285 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ca78991e-2860-4ef0-8c44-de6fd51082f3-model-cache\") pod \"ca78991e-2860-4ef0-8c44-de6fd51082f3\" (UID: \"ca78991e-2860-4ef0-8c44-de6fd51082f3\") " Apr 17 17:57:49.116443 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:57:49.116332 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ca78991e-2860-4ef0-8c44-de6fd51082f3-kserve-provision-location\") pod \"ca78991e-2860-4ef0-8c44-de6fd51082f3\" (UID: \"ca78991e-2860-4ef0-8c44-de6fd51082f3\") " Apr 17 17:57:49.116443 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:57:49.116373 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ca78991e-2860-4ef0-8c44-de6fd51082f3-tls-certs\") pod \"ca78991e-2860-4ef0-8c44-de6fd51082f3\" (UID: \"ca78991e-2860-4ef0-8c44-de6fd51082f3\") " Apr 17 17:57:49.116443 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:57:49.116404 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t7hsn\" (UniqueName: \"kubernetes.io/projected/ca78991e-2860-4ef0-8c44-de6fd51082f3-kube-api-access-t7hsn\") pod \"ca78991e-2860-4ef0-8c44-de6fd51082f3\" (UID: \"ca78991e-2860-4ef0-8c44-de6fd51082f3\") " Apr 17 17:57:49.116443 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:57:49.116437 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ca78991e-2860-4ef0-8c44-de6fd51082f3-home\") pod \"ca78991e-2860-4ef0-8c44-de6fd51082f3\" (UID: \"ca78991e-2860-4ef0-8c44-de6fd51082f3\") " Apr 17 17:57:49.116673 
ip-10-0-139-84 kubenswrapper[2573]: I0417 17:57:49.116576 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ca78991e-2860-4ef0-8c44-de6fd51082f3-dshm\") pod \"ca78991e-2860-4ef0-8c44-de6fd51082f3\" (UID: \"ca78991e-2860-4ef0-8c44-de6fd51082f3\") " Apr 17 17:57:49.116823 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:57:49.116575 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca78991e-2860-4ef0-8c44-de6fd51082f3-model-cache" (OuterVolumeSpecName: "model-cache") pod "ca78991e-2860-4ef0-8c44-de6fd51082f3" (UID: "ca78991e-2860-4ef0-8c44-de6fd51082f3"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:57:49.116823 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:57:49.116744 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca78991e-2860-4ef0-8c44-de6fd51082f3-home" (OuterVolumeSpecName: "home") pod "ca78991e-2860-4ef0-8c44-de6fd51082f3" (UID: "ca78991e-2860-4ef0-8c44-de6fd51082f3"). InnerVolumeSpecName "home". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:57:49.116980 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:57:49.116884 2573 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ca78991e-2860-4ef0-8c44-de6fd51082f3-home\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\"" Apr 17 17:57:49.116980 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:57:49.116903 2573 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ca78991e-2860-4ef0-8c44-de6fd51082f3-model-cache\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\"" Apr 17 17:57:49.118541 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:57:49.118518 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca78991e-2860-4ef0-8c44-de6fd51082f3-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "ca78991e-2860-4ef0-8c44-de6fd51082f3" (UID: "ca78991e-2860-4ef0-8c44-de6fd51082f3"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:57:49.118628 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:57:49.118546 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca78991e-2860-4ef0-8c44-de6fd51082f3-dshm" (OuterVolumeSpecName: "dshm") pod "ca78991e-2860-4ef0-8c44-de6fd51082f3" (UID: "ca78991e-2860-4ef0-8c44-de6fd51082f3"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:57:49.118628 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:57:49.118601 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca78991e-2860-4ef0-8c44-de6fd51082f3-kube-api-access-t7hsn" (OuterVolumeSpecName: "kube-api-access-t7hsn") pod "ca78991e-2860-4ef0-8c44-de6fd51082f3" (UID: "ca78991e-2860-4ef0-8c44-de6fd51082f3"). InnerVolumeSpecName "kube-api-access-t7hsn". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:57:49.181703 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:57:49.181649 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca78991e-2860-4ef0-8c44-de6fd51082f3-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "ca78991e-2860-4ef0-8c44-de6fd51082f3" (UID: "ca78991e-2860-4ef0-8c44-de6fd51082f3"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:57:49.218112 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:57:49.218087 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ca78991e-2860-4ef0-8c44-de6fd51082f3-kserve-provision-location\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\"" Apr 17 17:57:49.218112 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:57:49.218108 2573 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ca78991e-2860-4ef0-8c44-de6fd51082f3-tls-certs\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\"" Apr 17 17:57:49.218248 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:57:49.218119 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-t7hsn\" (UniqueName: \"kubernetes.io/projected/ca78991e-2860-4ef0-8c44-de6fd51082f3-kube-api-access-t7hsn\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\"" Apr 17 17:57:49.218248 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:57:49.218128 2573 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ca78991e-2860-4ef0-8c44-de6fd51082f3-dshm\") on node \"ip-10-0-139-84.ec2.internal\" DevicePath \"\"" Apr 17 17:57:49.749312 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:57:49.749277 2573 generic.go:358] "Generic (PLEG): container finished" podID="ca78991e-2860-4ef0-8c44-de6fd51082f3" 
containerID="2b7aa6fa42be530d642d05995fed0fdd8740ab8070a11da45992df34515ffb51" exitCode=137 Apr 17 17:57:49.749743 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:57:49.749343 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-c88b786f7-c7kxp" Apr 17 17:57:49.749743 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:57:49.749363 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-c88b786f7-c7kxp" event={"ID":"ca78991e-2860-4ef0-8c44-de6fd51082f3","Type":"ContainerDied","Data":"2b7aa6fa42be530d642d05995fed0fdd8740ab8070a11da45992df34515ffb51"} Apr 17 17:57:49.749743 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:57:49.749409 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-c88b786f7-c7kxp" event={"ID":"ca78991e-2860-4ef0-8c44-de6fd51082f3","Type":"ContainerDied","Data":"0dde35c4b49a3d6883b6cc66c6c55bd90de353658e6322d7959c963323132e0a"} Apr 17 17:57:49.749743 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:57:49.749433 2573 scope.go:117] "RemoveContainer" containerID="2b7aa6fa42be530d642d05995fed0fdd8740ab8070a11da45992df34515ffb51" Apr 17 17:57:49.770564 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:57:49.770234 2573 scope.go:117] "RemoveContainer" containerID="0f648687a6c3ff4b30e4dc0278bf7e71e4c97b5e7d75e7b196ec4d16c0760acd" Apr 17 17:57:49.772440 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:57:49.772421 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-c88b786f7-c7kxp"] Apr 17 17:57:49.776922 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:57:49.776902 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-c88b786f7-c7kxp"] Apr 17 17:57:49.842441 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:57:49.842420 2573 scope.go:117] 
"RemoveContainer" containerID="2b7aa6fa42be530d642d05995fed0fdd8740ab8070a11da45992df34515ffb51" Apr 17 17:57:49.842713 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:57:49.842693 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b7aa6fa42be530d642d05995fed0fdd8740ab8070a11da45992df34515ffb51\": container with ID starting with 2b7aa6fa42be530d642d05995fed0fdd8740ab8070a11da45992df34515ffb51 not found: ID does not exist" containerID="2b7aa6fa42be530d642d05995fed0fdd8740ab8070a11da45992df34515ffb51" Apr 17 17:57:49.842774 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:57:49.842723 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b7aa6fa42be530d642d05995fed0fdd8740ab8070a11da45992df34515ffb51"} err="failed to get container status \"2b7aa6fa42be530d642d05995fed0fdd8740ab8070a11da45992df34515ffb51\": rpc error: code = NotFound desc = could not find container \"2b7aa6fa42be530d642d05995fed0fdd8740ab8070a11da45992df34515ffb51\": container with ID starting with 2b7aa6fa42be530d642d05995fed0fdd8740ab8070a11da45992df34515ffb51 not found: ID does not exist" Apr 17 17:57:49.842774 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:57:49.842745 2573 scope.go:117] "RemoveContainer" containerID="0f648687a6c3ff4b30e4dc0278bf7e71e4c97b5e7d75e7b196ec4d16c0760acd" Apr 17 17:57:49.843010 ip-10-0-139-84 kubenswrapper[2573]: E0417 17:57:49.842992 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f648687a6c3ff4b30e4dc0278bf7e71e4c97b5e7d75e7b196ec4d16c0760acd\": container with ID starting with 0f648687a6c3ff4b30e4dc0278bf7e71e4c97b5e7d75e7b196ec4d16c0760acd not found: ID does not exist" containerID="0f648687a6c3ff4b30e4dc0278bf7e71e4c97b5e7d75e7b196ec4d16c0760acd" Apr 17 17:57:49.843090 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:57:49.843028 2573 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f648687a6c3ff4b30e4dc0278bf7e71e4c97b5e7d75e7b196ec4d16c0760acd"} err="failed to get container status \"0f648687a6c3ff4b30e4dc0278bf7e71e4c97b5e7d75e7b196ec4d16c0760acd\": rpc error: code = NotFound desc = could not find container \"0f648687a6c3ff4b30e4dc0278bf7e71e4c97b5e7d75e7b196ec4d16c0760acd\": container with ID starting with 0f648687a6c3ff4b30e4dc0278bf7e71e4c97b5e7d75e7b196ec4d16c0760acd not found: ID does not exist" Apr 17 17:57:51.196467 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:57:51.196434 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca78991e-2860-4ef0-8c44-de6fd51082f3" path="/var/lib/kubelet/pods/ca78991e-2860-4ef0-8c44-de6fd51082f3/volumes" Apr 17 17:57:51.616392 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:57:51.616356 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-844548ff4c-czcpz_54a34ad5-8be6-4d23-ab29-da07334381ed/manager/0.log" Apr 17 17:57:51.682670 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:57:51.682646 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-6ddf9554fc-5rwgs_d9d29c98-87be-4bb4-9834-786686ba286d/manager/0.log" Apr 17 17:57:51.692856 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:57:51.692829 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-67566c68b4-2qgzn_432af95c-352c-46b0-918c-f2d2a5abd0c4/limitador/0.log" Apr 17 17:57:54.099663 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:57:54.099623 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-6j5hz/must-gather-jjn8s"] Apr 17 17:57:54.100079 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:57:54.099934 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="030ac2a5-cc7d-49c9-ae10-69cc1c17fd8b" containerName="storage-initializer" Apr 17 17:57:54.100079 
ip-10-0-139-84 kubenswrapper[2573]: I0417 17:57:54.099947 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="030ac2a5-cc7d-49c9-ae10-69cc1c17fd8b" containerName="storage-initializer" Apr 17 17:57:54.100079 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:57:54.099959 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="030ac2a5-cc7d-49c9-ae10-69cc1c17fd8b" containerName="main" Apr 17 17:57:54.100079 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:57:54.099966 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="030ac2a5-cc7d-49c9-ae10-69cc1c17fd8b" containerName="main" Apr 17 17:57:54.100079 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:57:54.099983 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2c6c05e1-9ed7-47cf-b3d9-fb52c0dcbaf8" containerName="llm-d-routing-sidecar" Apr 17 17:57:54.100079 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:57:54.099992 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c6c05e1-9ed7-47cf-b3d9-fb52c0dcbaf8" containerName="llm-d-routing-sidecar" Apr 17 17:57:54.100079 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:57:54.100001 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2c6c05e1-9ed7-47cf-b3d9-fb52c0dcbaf8" containerName="storage-initializer" Apr 17 17:57:54.100079 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:57:54.100009 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c6c05e1-9ed7-47cf-b3d9-fb52c0dcbaf8" containerName="storage-initializer" Apr 17 17:57:54.100079 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:57:54.100042 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ca78991e-2860-4ef0-8c44-de6fd51082f3" containerName="storage-initializer" Apr 17 17:57:54.100079 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:57:54.100050 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca78991e-2860-4ef0-8c44-de6fd51082f3" containerName="storage-initializer" Apr 17 
17:57:54.100079 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:57:54.100059 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2c6c05e1-9ed7-47cf-b3d9-fb52c0dcbaf8" containerName="main" Apr 17 17:57:54.100079 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:57:54.100066 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c6c05e1-9ed7-47cf-b3d9-fb52c0dcbaf8" containerName="main" Apr 17 17:57:54.100079 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:57:54.100079 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ca78991e-2860-4ef0-8c44-de6fd51082f3" containerName="main" Apr 17 17:57:54.100079 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:57:54.100085 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca78991e-2860-4ef0-8c44-de6fd51082f3" containerName="main" Apr 17 17:57:54.100562 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:57:54.100139 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="2c6c05e1-9ed7-47cf-b3d9-fb52c0dcbaf8" containerName="main" Apr 17 17:57:54.100562 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:57:54.100148 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="ca78991e-2860-4ef0-8c44-de6fd51082f3" containerName="main" Apr 17 17:57:54.100562 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:57:54.100157 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="030ac2a5-cc7d-49c9-ae10-69cc1c17fd8b" containerName="main" Apr 17 17:57:54.100562 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:57:54.100163 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="2c6c05e1-9ed7-47cf-b3d9-fb52c0dcbaf8" containerName="llm-d-routing-sidecar" Apr 17 17:57:54.104908 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:57:54.104889 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6j5hz/must-gather-jjn8s" Apr 17 17:57:54.107611 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:57:54.107594 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-6j5hz\"/\"default-dockercfg-f7m6d\"" Apr 17 17:57:54.108227 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:57:54.108211 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-6j5hz\"/\"kube-root-ca.crt\"" Apr 17 17:57:54.108281 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:57:54.108235 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-6j5hz\"/\"openshift-service-ca.crt\"" Apr 17 17:57:54.121049 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:57:54.121006 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6j5hz/must-gather-jjn8s"] Apr 17 17:57:54.157272 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:57:54.157251 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nf9xz\" (UniqueName: \"kubernetes.io/projected/1ccd2da7-1797-4bd4-98b8-f8bada4b31f0-kube-api-access-nf9xz\") pod \"must-gather-jjn8s\" (UID: \"1ccd2da7-1797-4bd4-98b8-f8bada4b31f0\") " pod="openshift-must-gather-6j5hz/must-gather-jjn8s" Apr 17 17:57:54.157370 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:57:54.157281 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1ccd2da7-1797-4bd4-98b8-f8bada4b31f0-must-gather-output\") pod \"must-gather-jjn8s\" (UID: \"1ccd2da7-1797-4bd4-98b8-f8bada4b31f0\") " pod="openshift-must-gather-6j5hz/must-gather-jjn8s" Apr 17 17:57:54.258630 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:57:54.258608 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nf9xz\" (UniqueName: 
\"kubernetes.io/projected/1ccd2da7-1797-4bd4-98b8-f8bada4b31f0-kube-api-access-nf9xz\") pod \"must-gather-jjn8s\" (UID: \"1ccd2da7-1797-4bd4-98b8-f8bada4b31f0\") " pod="openshift-must-gather-6j5hz/must-gather-jjn8s" Apr 17 17:57:54.258713 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:57:54.258637 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1ccd2da7-1797-4bd4-98b8-f8bada4b31f0-must-gather-output\") pod \"must-gather-jjn8s\" (UID: \"1ccd2da7-1797-4bd4-98b8-f8bada4b31f0\") " pod="openshift-must-gather-6j5hz/must-gather-jjn8s" Apr 17 17:57:54.258935 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:57:54.258921 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1ccd2da7-1797-4bd4-98b8-f8bada4b31f0-must-gather-output\") pod \"must-gather-jjn8s\" (UID: \"1ccd2da7-1797-4bd4-98b8-f8bada4b31f0\") " pod="openshift-must-gather-6j5hz/must-gather-jjn8s" Apr 17 17:57:54.271943 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:57:54.271921 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nf9xz\" (UniqueName: \"kubernetes.io/projected/1ccd2da7-1797-4bd4-98b8-f8bada4b31f0-kube-api-access-nf9xz\") pod \"must-gather-jjn8s\" (UID: \"1ccd2da7-1797-4bd4-98b8-f8bada4b31f0\") " pod="openshift-must-gather-6j5hz/must-gather-jjn8s" Apr 17 17:57:54.414084 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:57:54.414063 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6j5hz/must-gather-jjn8s" Apr 17 17:57:54.538304 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:57:54.538269 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6j5hz/must-gather-jjn8s"] Apr 17 17:57:54.541072 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:57:54.541042 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ccd2da7_1797_4bd4_98b8_f8bada4b31f0.slice/crio-4e4862fb9f056bb62efc7e8fccc946ee92f8bc07d5b0e865a1b9d93d499ba822 WatchSource:0}: Error finding container 4e4862fb9f056bb62efc7e8fccc946ee92f8bc07d5b0e865a1b9d93d499ba822: Status 404 returned error can't find the container with id 4e4862fb9f056bb62efc7e8fccc946ee92f8bc07d5b0e865a1b9d93d499ba822 Apr 17 17:57:54.542864 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:57:54.542845 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 17:57:54.765511 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:57:54.765441 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6j5hz/must-gather-jjn8s" event={"ID":"1ccd2da7-1797-4bd4-98b8-f8bada4b31f0","Type":"ContainerStarted","Data":"4e4862fb9f056bb62efc7e8fccc946ee92f8bc07d5b0e865a1b9d93d499ba822"} Apr 17 17:57:55.770469 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:57:55.770436 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6j5hz/must-gather-jjn8s" event={"ID":"1ccd2da7-1797-4bd4-98b8-f8bada4b31f0","Type":"ContainerStarted","Data":"1547976fab3926d72f689f4316bc03d3bfd744ba6130b4116b705aa89e4f57c7"} Apr 17 17:57:55.770469 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:57:55.770475 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6j5hz/must-gather-jjn8s" 
event={"ID":"1ccd2da7-1797-4bd4-98b8-f8bada4b31f0","Type":"ContainerStarted","Data":"c7564defb24396a09e9eb2b0730b5d99bbbaf947f1547bface9bfc225ba309fe"} Apr 17 17:57:55.789893 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:57:55.789819 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-6j5hz/must-gather-jjn8s" podStartSLOduration=1.070785039 podStartE2EDuration="1.789806549s" podCreationTimestamp="2026-04-17 17:57:54 +0000 UTC" firstStartedPulling="2026-04-17 17:57:54.542970098 +0000 UTC m=+2019.859051918" lastFinishedPulling="2026-04-17 17:57:55.261991603 +0000 UTC m=+2020.578073428" observedRunningTime="2026-04-17 17:57:55.787680795 +0000 UTC m=+2021.103762637" watchObservedRunningTime="2026-04-17 17:57:55.789806549 +0000 UTC m=+2021.105888390" Apr 17 17:57:56.986736 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:57:56.986706 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-tkbg5_a32594c3-cfcc-4c83-88d7-263f8c57a927/global-pull-secret-syncer/0.log" Apr 17 17:57:57.081794 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:57:57.081753 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-jwn7v_5d28da5f-188e-4c8f-a4ec-2178fc14de55/konnectivity-agent/0.log" Apr 17 17:57:57.191035 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:57:57.190989 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-139-84.ec2.internal_243a441188fa30c2a72c720d3a8cc5f2/haproxy/0.log" Apr 17 17:58:01.146635 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:58:01.146602 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-844548ff4c-czcpz_54a34ad5-8be6-4d23-ab29-da07334381ed/manager/0.log" Apr 17 17:58:01.266213 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:58:01.266122 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-6ddf9554fc-5rwgs_d9d29c98-87be-4bb4-9834-786686ba286d/manager/0.log" Apr 17 17:58:01.309546 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:58:01.309514 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-67566c68b4-2qgzn_432af95c-352c-46b0-918c-f2d2a5abd0c4/limitador/0.log" Apr 17 17:58:02.595920 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:58:02.595820 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-lklgp_5d29d095-2263-4fd2-98c2-ee342220c960/kube-state-metrics/0.log" Apr 17 17:58:02.623973 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:58:02.623932 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-lklgp_5d29d095-2263-4fd2-98c2-ee342220c960/kube-rbac-proxy-main/0.log" Apr 17 17:58:02.655862 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:58:02.655837 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-lklgp_5d29d095-2263-4fd2-98c2-ee342220c960/kube-rbac-proxy-self/0.log" Apr 17 17:58:02.993684 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:58:02.993655 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-ndc6q_bd7121ef-8833-411e-9ab4-3de1db83ef61/node-exporter/0.log" Apr 17 17:58:03.065675 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:58:03.065601 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-ndc6q_bd7121ef-8833-411e-9ab4-3de1db83ef61/kube-rbac-proxy/0.log" Apr 17 17:58:03.100283 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:58:03.100249 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-ndc6q_bd7121ef-8833-411e-9ab4-3de1db83ef61/init-textfile/0.log" Apr 17 17:58:03.134642 ip-10-0-139-84 kubenswrapper[2573]: I0417 
17:58:03.134604 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-24sl4_e9b9ea6f-26a0-4cca-b8be-5bedbf607826/kube-rbac-proxy-main/0.log" Apr 17 17:58:03.166407 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:58:03.166337 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-24sl4_e9b9ea6f-26a0-4cca-b8be-5bedbf607826/kube-rbac-proxy-self/0.log" Apr 17 17:58:03.197508 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:58:03.197432 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-24sl4_e9b9ea6f-26a0-4cca-b8be-5bedbf607826/openshift-state-metrics/0.log" Apr 17 17:58:03.244001 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:58:03.243973 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_7e87163c-fc87-4aa7-b0f1-a92c13c69a22/prometheus/0.log" Apr 17 17:58:03.266328 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:58:03.266298 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_7e87163c-fc87-4aa7-b0f1-a92c13c69a22/config-reloader/0.log" Apr 17 17:58:03.295894 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:58:03.295864 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_7e87163c-fc87-4aa7-b0f1-a92c13c69a22/thanos-sidecar/0.log" Apr 17 17:58:03.322457 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:58:03.322426 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_7e87163c-fc87-4aa7-b0f1-a92c13c69a22/kube-rbac-proxy-web/0.log" Apr 17 17:58:03.355382 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:58:03.355347 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_7e87163c-fc87-4aa7-b0f1-a92c13c69a22/kube-rbac-proxy/0.log" Apr 17 17:58:03.390769 
ip-10-0-139-84 kubenswrapper[2573]: I0417 17:58:03.390737 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_7e87163c-fc87-4aa7-b0f1-a92c13c69a22/kube-rbac-proxy-thanos/0.log"
Apr 17 17:58:03.420462 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:58:03.420434 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_7e87163c-fc87-4aa7-b0f1-a92c13c69a22/init-config-reloader/0.log"
Apr 17 17:58:05.404184 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:58:05.404153 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-6j5hz/perf-node-gather-daemonset-r48rt"]
Apr 17 17:58:05.408872 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:58:05.408854 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6j5hz/perf-node-gather-daemonset-r48rt"
Apr 17 17:58:05.424713 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:58:05.424686 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6j5hz/perf-node-gather-daemonset-r48rt"]
Apr 17 17:58:05.562462 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:58:05.562425 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/045f4be7-8051-472c-ae2b-06e20300270e-sys\") pod \"perf-node-gather-daemonset-r48rt\" (UID: \"045f4be7-8051-472c-ae2b-06e20300270e\") " pod="openshift-must-gather-6j5hz/perf-node-gather-daemonset-r48rt"
Apr 17 17:58:05.562653 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:58:05.562482 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/045f4be7-8051-472c-ae2b-06e20300270e-podres\") pod \"perf-node-gather-daemonset-r48rt\" (UID: \"045f4be7-8051-472c-ae2b-06e20300270e\") " pod="openshift-must-gather-6j5hz/perf-node-gather-daemonset-r48rt"
Apr 17 17:58:05.562653 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:58:05.562524 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwhd6\" (UniqueName: \"kubernetes.io/projected/045f4be7-8051-472c-ae2b-06e20300270e-kube-api-access-gwhd6\") pod \"perf-node-gather-daemonset-r48rt\" (UID: \"045f4be7-8051-472c-ae2b-06e20300270e\") " pod="openshift-must-gather-6j5hz/perf-node-gather-daemonset-r48rt"
Apr 17 17:58:05.562653 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:58:05.562585 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/045f4be7-8051-472c-ae2b-06e20300270e-lib-modules\") pod \"perf-node-gather-daemonset-r48rt\" (UID: \"045f4be7-8051-472c-ae2b-06e20300270e\") " pod="openshift-must-gather-6j5hz/perf-node-gather-daemonset-r48rt"
Apr 17 17:58:05.562653 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:58:05.562637 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/045f4be7-8051-472c-ae2b-06e20300270e-proc\") pod \"perf-node-gather-daemonset-r48rt\" (UID: \"045f4be7-8051-472c-ae2b-06e20300270e\") " pod="openshift-must-gather-6j5hz/perf-node-gather-daemonset-r48rt"
Apr 17 17:58:05.663306 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:58:05.663235 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/045f4be7-8051-472c-ae2b-06e20300270e-lib-modules\") pod \"perf-node-gather-daemonset-r48rt\" (UID: \"045f4be7-8051-472c-ae2b-06e20300270e\") " pod="openshift-must-gather-6j5hz/perf-node-gather-daemonset-r48rt"
Apr 17 17:58:05.663306 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:58:05.663270 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/045f4be7-8051-472c-ae2b-06e20300270e-proc\") pod \"perf-node-gather-daemonset-r48rt\" (UID: \"045f4be7-8051-472c-ae2b-06e20300270e\") " pod="openshift-must-gather-6j5hz/perf-node-gather-daemonset-r48rt"
Apr 17 17:58:05.663507 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:58:05.663311 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/045f4be7-8051-472c-ae2b-06e20300270e-sys\") pod \"perf-node-gather-daemonset-r48rt\" (UID: \"045f4be7-8051-472c-ae2b-06e20300270e\") " pod="openshift-must-gather-6j5hz/perf-node-gather-daemonset-r48rt"
Apr 17 17:58:05.663507 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:58:05.663325 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/045f4be7-8051-472c-ae2b-06e20300270e-podres\") pod \"perf-node-gather-daemonset-r48rt\" (UID: \"045f4be7-8051-472c-ae2b-06e20300270e\") " pod="openshift-must-gather-6j5hz/perf-node-gather-daemonset-r48rt"
Apr 17 17:58:05.663507 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:58:05.663347 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gwhd6\" (UniqueName: \"kubernetes.io/projected/045f4be7-8051-472c-ae2b-06e20300270e-kube-api-access-gwhd6\") pod \"perf-node-gather-daemonset-r48rt\" (UID: \"045f4be7-8051-472c-ae2b-06e20300270e\") " pod="openshift-must-gather-6j5hz/perf-node-gather-daemonset-r48rt"
Apr 17 17:58:05.663507 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:58:05.663398 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/045f4be7-8051-472c-ae2b-06e20300270e-lib-modules\") pod \"perf-node-gather-daemonset-r48rt\" (UID: \"045f4be7-8051-472c-ae2b-06e20300270e\") " pod="openshift-must-gather-6j5hz/perf-node-gather-daemonset-r48rt"
Apr 17 17:58:05.663507 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:58:05.663409 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/045f4be7-8051-472c-ae2b-06e20300270e-sys\") pod \"perf-node-gather-daemonset-r48rt\" (UID: \"045f4be7-8051-472c-ae2b-06e20300270e\") " pod="openshift-must-gather-6j5hz/perf-node-gather-daemonset-r48rt"
Apr 17 17:58:05.663507 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:58:05.663428 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/045f4be7-8051-472c-ae2b-06e20300270e-proc\") pod \"perf-node-gather-daemonset-r48rt\" (UID: \"045f4be7-8051-472c-ae2b-06e20300270e\") " pod="openshift-must-gather-6j5hz/perf-node-gather-daemonset-r48rt"
Apr 17 17:58:05.663507 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:58:05.663501 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/045f4be7-8051-472c-ae2b-06e20300270e-podres\") pod \"perf-node-gather-daemonset-r48rt\" (UID: \"045f4be7-8051-472c-ae2b-06e20300270e\") " pod="openshift-must-gather-6j5hz/perf-node-gather-daemonset-r48rt"
Apr 17 17:58:05.672248 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:58:05.672225 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwhd6\" (UniqueName: \"kubernetes.io/projected/045f4be7-8051-472c-ae2b-06e20300270e-kube-api-access-gwhd6\") pod \"perf-node-gather-daemonset-r48rt\" (UID: \"045f4be7-8051-472c-ae2b-06e20300270e\") " pod="openshift-must-gather-6j5hz/perf-node-gather-daemonset-r48rt"
Apr 17 17:58:05.721298 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:58:05.721273 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6j5hz/perf-node-gather-daemonset-r48rt"
Apr 17 17:58:05.866906 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:58:05.866854 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6j5hz/perf-node-gather-daemonset-r48rt"]
Apr 17 17:58:05.871806 ip-10-0-139-84 kubenswrapper[2573]: W0417 17:58:05.871772 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod045f4be7_8051_472c_ae2b_06e20300270e.slice/crio-40871a2455b8c31da5782288b591a48ecc2bbf6183e689c896565330af16531d WatchSource:0}: Error finding container 40871a2455b8c31da5782288b591a48ecc2bbf6183e689c896565330af16531d: Status 404 returned error can't find the container with id 40871a2455b8c31da5782288b591a48ecc2bbf6183e689c896565330af16531d
Apr 17 17:58:06.818118 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:58:06.818078 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6j5hz/perf-node-gather-daemonset-r48rt" event={"ID":"045f4be7-8051-472c-ae2b-06e20300270e","Type":"ContainerStarted","Data":"3da785a61d2e427f9fc185d52f955ee7fac6c52e3eee9676d30cc9618c7f70cf"}
Apr 17 17:58:06.818118 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:58:06.818123 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6j5hz/perf-node-gather-daemonset-r48rt" event={"ID":"045f4be7-8051-472c-ae2b-06e20300270e","Type":"ContainerStarted","Data":"40871a2455b8c31da5782288b591a48ecc2bbf6183e689c896565330af16531d"}
Apr 17 17:58:06.818512 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:58:06.818203 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-6j5hz/perf-node-gather-daemonset-r48rt"
Apr 17 17:58:06.838675 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:58:06.838619 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-6j5hz/perf-node-gather-daemonset-r48rt" podStartSLOduration=1.838601119 podStartE2EDuration="1.838601119s" podCreationTimestamp="2026-04-17 17:58:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:58:06.836773266 +0000 UTC m=+2032.152855121" watchObservedRunningTime="2026-04-17 17:58:06.838601119 +0000 UTC m=+2032.154682965"
Apr 17 17:58:07.348186 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:58:07.348153 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-77nfq_85412bc5-20cb-438b-9637-8f85717abf24/dns/0.log"
Apr 17 17:58:07.376258 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:58:07.376231 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-77nfq_85412bc5-20cb-438b-9637-8f85717abf24/kube-rbac-proxy/0.log"
Apr 17 17:58:07.524735 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:58:07.524705 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-7l6j9_9b0827a8-3eb5-4863-995e-e708ba3f0756/dns-node-resolver/0.log"
Apr 17 17:58:08.177079 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:58:08.177052 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-whd7s_96e2a5d4-ba15-434e-9514-847bbd7fec29/node-ca/0.log"
Apr 17 17:58:09.649521 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:58:09.649497 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-l4qxr_967a37c4-4cd4-49ce-9611-47bd8e1bf9dd/serve-healthcheck-canary/0.log"
Apr 17 17:58:10.389475 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:58:10.389450 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-xrs4v_20ec495a-dce4-465c-b00c-8812ae039f7b/kube-rbac-proxy/0.log"
Apr 17 17:58:10.416758 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:58:10.416738 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-xrs4v_20ec495a-dce4-465c-b00c-8812ae039f7b/exporter/0.log"
Apr 17 17:58:10.441476 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:58:10.441453 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-xrs4v_20ec495a-dce4-465c-b00c-8812ae039f7b/extractor/0.log"
Apr 17 17:58:12.835762 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:58:12.835729 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-6j5hz/perf-node-gather-daemonset-r48rt"
Apr 17 17:58:13.137774 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:58:13.137742 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-6dd684f56d-spb9f_df9c0da7-c1ce-42f4-a69f-89489de90e86/manager/0.log"
Apr 17 17:58:13.194399 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:58:13.194373 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_openshift-lws-operator-bfc7f696d-r5kkx_1b3d3a9b-a10b-4570-b17c-6583411b2763/openshift-lws-operator/0.log"
Apr 17 17:58:21.808935 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:58:21.808899 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-lsns4_8c1b9fc8-4b21-4201-8f92-dce101b1890a/kube-multus-additional-cni-plugins/0.log"
Apr 17 17:58:21.837842 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:58:21.837810 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-lsns4_8c1b9fc8-4b21-4201-8f92-dce101b1890a/egress-router-binary-copy/0.log"
Apr 17 17:58:21.867076 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:58:21.867052 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-lsns4_8c1b9fc8-4b21-4201-8f92-dce101b1890a/cni-plugins/0.log"
Apr 17 17:58:21.895121 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:58:21.895044 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-lsns4_8c1b9fc8-4b21-4201-8f92-dce101b1890a/bond-cni-plugin/0.log"
Apr 17 17:58:21.919937 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:58:21.919914 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-lsns4_8c1b9fc8-4b21-4201-8f92-dce101b1890a/routeoverride-cni/0.log"
Apr 17 17:58:21.948132 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:58:21.948108 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-lsns4_8c1b9fc8-4b21-4201-8f92-dce101b1890a/whereabouts-cni-bincopy/0.log"
Apr 17 17:58:21.974806 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:58:21.974769 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-lsns4_8c1b9fc8-4b21-4201-8f92-dce101b1890a/whereabouts-cni/0.log"
Apr 17 17:58:22.065416 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:58:22.065380 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-x8vrb_b7181578-5b02-4803-bf4b-fbb5cec55a12/kube-multus/0.log"
Apr 17 17:58:22.155484 ip-10-0-139-84 kubenswrapper[2573]: I0417 17:58:22.155402 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-llf4n_8a06ed33-b68b-4c09-88ed-8a7af0e52ef4/network-metrics-daemon/0.log"