Apr 16 14:50:03.715354 ip-10-0-139-47 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 16 14:50:03.715364 ip-10-0-139-47 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 16 14:50:03.715371 ip-10-0-139-47 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 16 14:50:03.715608 ip-10-0-139-47 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 16 14:50:15.043772 ip-10-0-139-47 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 16 14:50:15.043786 ip-10-0-139-47 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 2f9856ae21ad4c809abc57e94d4c86bf --
Apr 16 14:52:30.693206 ip-10-0-139-47 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 14:52:31.147667 ip-10-0-139-47 kubenswrapper[2579]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 14:52:31.147667 ip-10-0-139-47 kubenswrapper[2579]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 14:52:31.147667 ip-10-0-139-47 kubenswrapper[2579]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 14:52:31.147667 ip-10-0-139-47 kubenswrapper[2579]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 14:52:31.147667 ip-10-0-139-47 kubenswrapper[2579]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 14:52:31.149491 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.149337    2579 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 14:52:31.152603 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.152585    2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 14:52:31.152603 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.152604    2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 14:52:31.152673 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.152607    2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 14:52:31.152673 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.152611    2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 14:52:31.152673 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.152614    2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 14:52:31.152673 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.152618    2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 14:52:31.152673 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.152621    2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 14:52:31.152673 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.152624    2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 14:52:31.152673 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.152627    2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 14:52:31.152673 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.152630    2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 14:52:31.152673 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.152634    2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 14:52:31.152673 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.152639    2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 14:52:31.152673 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.152642    2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 14:52:31.152673 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.152645    2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 14:52:31.152673 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.152648    2579 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 14:52:31.152673 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.152651    2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 14:52:31.152673 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.152653    2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 14:52:31.152673 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.152656    2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 14:52:31.152673 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.152658    2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 14:52:31.152673 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.152661    2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 14:52:31.152673 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.152663    2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 14:52:31.152673 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.152666    2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 14:52:31.153153 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.152669    2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 14:52:31.153153 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.152671    2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 14:52:31.153153 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.152674    2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 14:52:31.153153 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.152677    2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 14:52:31.153153 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.152696    2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 14:52:31.153153 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.152700    2579 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 14:52:31.153153 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.152702    2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 14:52:31.153153 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.152705    2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 14:52:31.153153 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.152708    2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 14:52:31.153153 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.152711    2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 14:52:31.153153 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.152713    2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 14:52:31.153153 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.152716    2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 14:52:31.153153 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.152719    2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 14:52:31.153153 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.152721    2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 14:52:31.153153 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.152724    2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 14:52:31.153153 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.152727    2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 14:52:31.153153 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.152729    2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 14:52:31.153153 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.152732    2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 14:52:31.153153 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.152734    2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 14:52:31.153153 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.152738    2579 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 14:52:31.153677 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.152740    2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 14:52:31.153677 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.152743    2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 14:52:31.153677 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.152746    2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 14:52:31.153677 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.152749    2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 14:52:31.153677 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.152751    2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 14:52:31.153677 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.152754    2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 14:52:31.153677 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.152761    2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 14:52:31.153677 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.152763    2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 14:52:31.153677 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.152766    2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 14:52:31.153677 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.152768    2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 14:52:31.153677 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.152771    2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 14:52:31.153677 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.152773    2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 14:52:31.153677 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.152776    2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 14:52:31.153677 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.152780    2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 14:52:31.153677 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.152783    2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 14:52:31.153677 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.152787    2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 14:52:31.153677 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.152791    2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 14:52:31.153677 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.152794    2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 14:52:31.153677 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.152797    2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 14:52:31.154163 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.152800    2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 14:52:31.154163 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.152803    2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 14:52:31.154163 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.152805    2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 14:52:31.154163 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.152809    2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 14:52:31.154163 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.152812    2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 14:52:31.154163 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.152815    2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 14:52:31.154163 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.152817    2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 14:52:31.154163 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.152820    2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 14:52:31.154163 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.152822    2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 14:52:31.154163 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.152825    2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 14:52:31.154163 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.152828    2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 14:52:31.154163 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.152830    2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 14:52:31.154163 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.152833    2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 14:52:31.154163 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.152836    2579 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 14:52:31.154163 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.152839    2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 14:52:31.154163 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.152842    2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 14:52:31.154163 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.152844    2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 14:52:31.154163 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.152854    2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 14:52:31.154163 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.152858    2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 14:52:31.154163 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.152860    2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 14:52:31.154652 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.152863    2579 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 14:52:31.154652 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.152866    2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 14:52:31.154652 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.152868    2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 14:52:31.154652 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.152871    2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 14:52:31.154652 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.152874    2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 14:52:31.154652 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.153308    2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 14:52:31.154652 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.153316    2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 14:52:31.154652 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.153320    2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 14:52:31.154652 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.153323    2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 14:52:31.154652 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.153326    2579 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 14:52:31.154652 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.153329    2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 14:52:31.154652 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.153332    2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 14:52:31.154652 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.153335    2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 14:52:31.154652 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.153337    2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 14:52:31.154652 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.153340    2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 14:52:31.154652 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.153342    2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 14:52:31.154652 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.153345    2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 14:52:31.154652 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.153347    2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 14:52:31.154652 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.153350    2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 14:52:31.154652 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.153353    2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 14:52:31.155151 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.153356    2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 14:52:31.155151 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.153358    2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 14:52:31.155151 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.153361    2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 14:52:31.155151 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.153364    2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 14:52:31.155151 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.153366    2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 14:52:31.155151 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.153368    2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 14:52:31.155151 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.153371    2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 14:52:31.155151 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.153376    2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 14:52:31.155151 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.153379    2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 14:52:31.155151 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.153382    2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 14:52:31.155151 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.153385    2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 14:52:31.155151 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.153388    2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 14:52:31.155151 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.153390    2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 14:52:31.155151 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.153393    2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 14:52:31.155151 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.153395    2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 14:52:31.155151 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.153398    2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 14:52:31.155151 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.153401    2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 14:52:31.155151 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.153404    2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 14:52:31.155151 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.153406    2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 14:52:31.155151 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.153408    2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 14:52:31.155729 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.153411    2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 14:52:31.155729 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.153413    2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 14:52:31.155729 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.153416    2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 14:52:31.155729 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.153418    2579 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 14:52:31.155729 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.153421    2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 14:52:31.155729 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.153423    2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 14:52:31.155729 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.153426    2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 14:52:31.155729 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.153428    2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 14:52:31.155729 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.153431    2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 14:52:31.155729 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.153433    2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 14:52:31.155729 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.153436    2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 14:52:31.155729 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.153439    2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 14:52:31.155729 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.153441    2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 14:52:31.155729 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.153443    2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 14:52:31.155729 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.153449    2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 14:52:31.155729 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.153452    2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 14:52:31.155729 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.153454    2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 14:52:31.155729 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.153458    2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 14:52:31.155729 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.153462    2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 14:52:31.156210 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.153465    2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 14:52:31.156210 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.153468    2579 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 14:52:31.156210 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.153471    2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 14:52:31.156210 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.153474    2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 14:52:31.156210 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.153476    2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 14:52:31.156210 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.153479    2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 14:52:31.156210 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.153481    2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 14:52:31.156210 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.153484    2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 14:52:31.156210 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.153486    2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 14:52:31.156210 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.153489    2579 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 14:52:31.156210 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.153492    2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 14:52:31.156210 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.153494    2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 14:52:31.156210 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.153497    2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 14:52:31.156210 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.153499    2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 14:52:31.156210 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.153502    2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 14:52:31.156210 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.153505    2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 14:52:31.156210 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.153507    2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 14:52:31.156210 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.153510    2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 14:52:31.156210 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.153513    2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 14:52:31.156210 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.153515    2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 14:52:31.156696 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.153518    2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 14:52:31.156696 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.153520    2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 14:52:31.156696 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.153522    2579 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 14:52:31.156696 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.153525    2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 14:52:31.156696 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.153528    2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 14:52:31.156696 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.153530    2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 14:52:31.156696 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.153533    2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 14:52:31.156696 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.153535    2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 14:52:31.156696 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.153537    2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 14:52:31.156696 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.153540    2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 14:52:31.156696 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.153542    2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 14:52:31.156696 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.153545    2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 14:52:31.156696 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.153625    2579 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 14:52:31.156696 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.153637    2579 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 14:52:31.156696 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.153645    2579 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 14:52:31.156696 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.153649    2579 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 14:52:31.156696 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.153654    2579 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 14:52:31.156696 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.153658    2579 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 14:52:31.156696 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.153663    2579 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 14:52:31.156696 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.153667    2579 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 14:52:31.156696 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.153671    2579 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 14:52:31.157221 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.153674    2579 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 14:52:31.157221 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.153677    2579 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 14:52:31.157221 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.153681    2579 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 14:52:31.157221 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.153684    2579 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 14:52:31.157221 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.153687    2579 flags.go:64] FLAG: --cgroup-root=""
Apr 16 14:52:31.157221 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.153690    2579 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 14:52:31.157221 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.153693    2579 flags.go:64] FLAG: --client-ca-file=""
Apr 16 14:52:31.157221 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.153696    2579 flags.go:64] FLAG: --cloud-config=""
Apr 16 14:52:31.157221 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.153699    2579 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 14:52:31.157221 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.153702    2579 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 14:52:31.157221 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.153707    2579 flags.go:64] FLAG: --cluster-domain=""
Apr 16 14:52:31.157221 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.153710    2579 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 14:52:31.157221 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.153713    2579 flags.go:64] FLAG: --config-dir=""
Apr 16 14:52:31.157221 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.153721    2579 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 14:52:31.157221 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.153725    2579 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 14:52:31.157221 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.153729    2579 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 14:52:31.157221 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.153732    2579 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 14:52:31.157221 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.153735    2579 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 14:52:31.157221 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.153738    2579 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 14:52:31.157221 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.153741    2579 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 14:52:31.157221 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.153744    2579 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 14:52:31.157221 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.153748    2579 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 14:52:31.157221 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.153751    2579 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 14:52:31.157221 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.153754    2579 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 14:52:31.157221 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.153758    2579 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 14:52:31.157841 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.153762    2579 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 14:52:31.157841 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.153764    2579 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 14:52:31.157841 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.153768    2579 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 14:52:31.157841 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.153770    2579 flags.go:64] FLAG: --enable-server="true"
Apr 16 14:52:31.157841 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.153773    2579 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 14:52:31.157841 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.153779    2579 flags.go:64] FLAG: --event-burst="100"
Apr 16 14:52:31.157841 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.153782    2579 flags.go:64] FLAG: --event-qps="50"
Apr 16 14:52:31.157841 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.153785    2579 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 14:52:31.157841 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.153788    2579 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 14:52:31.157841 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.153791    2579 flags.go:64] FLAG: --eviction-hard=""
Apr 16 14:52:31.157841 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.153795    2579 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 14:52:31.157841 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.153798    2579 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 14:52:31.157841 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.153802    2579 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 14:52:31.157841 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.153805    2579 flags.go:64] FLAG: --eviction-soft=""
Apr 16 14:52:31.157841 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.153808    2579 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 14:52:31.157841 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.153811    2579 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 14:52:31.157841 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.153814    2579 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 16 14:52:31.157841 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.153817    2579 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 16 14:52:31.157841 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.153821    2579 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 16 14:52:31.157841 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.153824    2579 flags.go:64] FLAG: --fail-swap-on="true"
Apr 16 14:52:31.157841 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.153828    2579 flags.go:64] FLAG: --feature-gates=""
Apr 16 14:52:31.157841 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.153832    2579 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 16 14:52:31.157841 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.153835    2579 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 16 14:52:31.157841 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.153839    2579 flags.go:64]
FLAG: --hairpin-mode="promiscuous-bridge" Apr 16 14:52:31.157841 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.153843 2579 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 16 14:52:31.158468 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.153846 2579 flags.go:64] FLAG: --healthz-port="10248" Apr 16 14:52:31.158468 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.153849 2579 flags.go:64] FLAG: --help="false" Apr 16 14:52:31.158468 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.153852 2579 flags.go:64] FLAG: --hostname-override="ip-10-0-139-47.ec2.internal" Apr 16 14:52:31.158468 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.153855 2579 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 14:52:31.158468 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.153858 2579 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 14:52:31.158468 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.153861 2579 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 14:52:31.158468 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.153865 2579 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 14:52:31.158468 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.153868 2579 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 14:52:31.158468 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.153872 2579 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 14:52:31.158468 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.153875 2579 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 14:52:31.158468 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.153878 2579 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 14:52:31.158468 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.153881 2579 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 14:52:31.158468 ip-10-0-139-47 kubenswrapper[2579]: I0416 
14:52:31.153884 2579 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 14:52:31.158468 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.153887 2579 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 14:52:31.158468 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.153890 2579 flags.go:64] FLAG: --kube-reserved="" Apr 16 14:52:31.158468 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.153893 2579 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 14:52:31.158468 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.153896 2579 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 14:52:31.158468 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.153913 2579 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 14:52:31.158468 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.153917 2579 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 14:52:31.158468 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.153926 2579 flags.go:64] FLAG: --lock-file="" Apr 16 14:52:31.158468 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.153931 2579 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 14:52:31.158468 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.153935 2579 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 14:52:31.158468 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.153938 2579 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 14:52:31.158468 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.153944 2579 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 14:52:31.159111 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.153947 2579 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 14:52:31.159111 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.153950 2579 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 14:52:31.159111 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.153953 2579 flags.go:64] FLAG: --logging-format="text" Apr 16 14:52:31.159111 
ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.153958 2579 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 14:52:31.159111 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.153962 2579 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 14:52:31.159111 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.153965 2579 flags.go:64] FLAG: --manifest-url="" Apr 16 14:52:31.159111 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.153968 2579 flags.go:64] FLAG: --manifest-url-header="" Apr 16 14:52:31.159111 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.153972 2579 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 14:52:31.159111 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.153975 2579 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 14:52:31.159111 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.153980 2579 flags.go:64] FLAG: --max-pods="110" Apr 16 14:52:31.159111 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.153983 2579 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 14:52:31.159111 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.153986 2579 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 14:52:31.159111 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.153989 2579 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 14:52:31.159111 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.153992 2579 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 14:52:31.159111 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.153996 2579 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 14:52:31.159111 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.153998 2579 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 14:52:31.159111 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.154001 2579 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 14:52:31.159111 ip-10-0-139-47 
kubenswrapper[2579]: I0416 14:52:31.154010 2579 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 14:52:31.159111 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.154014 2579 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 14:52:31.159111 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.154017 2579 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 14:52:31.159111 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.154020 2579 flags.go:64] FLAG: --pod-cidr="" Apr 16 14:52:31.159111 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.154024 2579 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc76bab72f320de3d4105c90d73c4fb139c09e20ce0fa8dcbc0cb59920d27dec" Apr 16 14:52:31.159111 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.154030 2579 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 14:52:31.159688 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.154033 2579 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 14:52:31.159688 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.154036 2579 flags.go:64] FLAG: --pods-per-core="0" Apr 16 14:52:31.159688 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.154039 2579 flags.go:64] FLAG: --port="10250" Apr 16 14:52:31.159688 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.154043 2579 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 14:52:31.159688 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.154045 2579 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0d18582494f5f4362" Apr 16 14:52:31.159688 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.154049 2579 flags.go:64] FLAG: --qos-reserved="" Apr 16 14:52:31.159688 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.154052 2579 flags.go:64] FLAG: --read-only-port="10255" Apr 16 14:52:31.159688 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.154055 2579 flags.go:64] FLAG: --register-node="true" Apr 16 14:52:31.159688 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.154058 
2579 flags.go:64] FLAG: --register-schedulable="true" Apr 16 14:52:31.159688 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.154060 2579 flags.go:64] FLAG: --register-with-taints="" Apr 16 14:52:31.159688 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.154065 2579 flags.go:64] FLAG: --registry-burst="10" Apr 16 14:52:31.159688 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.154068 2579 flags.go:64] FLAG: --registry-qps="5" Apr 16 14:52:31.159688 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.154070 2579 flags.go:64] FLAG: --reserved-cpus="" Apr 16 14:52:31.159688 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.154075 2579 flags.go:64] FLAG: --reserved-memory="" Apr 16 14:52:31.159688 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.154082 2579 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 14:52:31.159688 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.154085 2579 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 14:52:31.159688 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.154089 2579 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 14:52:31.159688 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.154092 2579 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 14:52:31.159688 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.154095 2579 flags.go:64] FLAG: --runonce="false" Apr 16 14:52:31.159688 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.154098 2579 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 14:52:31.159688 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.154101 2579 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 14:52:31.159688 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.154104 2579 flags.go:64] FLAG: --seccomp-default="false" Apr 16 14:52:31.159688 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.154107 2579 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 14:52:31.159688 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.154110 
2579 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 14:52:31.159688 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.154113 2579 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 14:52:31.159688 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.154116 2579 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 14:52:31.160341 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.154119 2579 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 14:52:31.160341 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.154123 2579 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 14:52:31.160341 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.154126 2579 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 14:52:31.160341 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.154129 2579 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 14:52:31.160341 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.154132 2579 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 14:52:31.160341 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.154135 2579 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 14:52:31.160341 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.154138 2579 flags.go:64] FLAG: --system-cgroups="" Apr 16 14:52:31.160341 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.154141 2579 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 14:52:31.160341 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.154146 2579 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 14:52:31.160341 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.154149 2579 flags.go:64] FLAG: --tls-cert-file="" Apr 16 14:52:31.160341 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.154152 2579 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 16 14:52:31.160341 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.154183 2579 flags.go:64] FLAG: --tls-min-version="" Apr 16 14:52:31.160341 
ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.154187 2579 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 14:52:31.160341 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.154190 2579 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 14:52:31.160341 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.154193 2579 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 14:52:31.160341 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.154197 2579 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 14:52:31.160341 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.154200 2579 flags.go:64] FLAG: --v="2" Apr 16 14:52:31.160341 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.154204 2579 flags.go:64] FLAG: --version="false" Apr 16 14:52:31.160341 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.154208 2579 flags.go:64] FLAG: --vmodule="" Apr 16 14:52:31.160341 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.154214 2579 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 14:52:31.160341 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.154218 2579 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 14:52:31.160341 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.154314 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 14:52:31.160341 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.154317 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 14:52:31.160341 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.154320 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 14:52:31.160933 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.154323 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 14:52:31.160933 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.154326 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 
14:52:31.160933 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.154328 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 14:52:31.160933 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.154331 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 14:52:31.160933 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.154334 2579 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 14:52:31.160933 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.154336 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 14:52:31.160933 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.154339 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 14:52:31.160933 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.154341 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 14:52:31.160933 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.154344 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 14:52:31.160933 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.154347 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 14:52:31.160933 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.154350 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 14:52:31.160933 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.154355 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 14:52:31.160933 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.154360 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 14:52:31.160933 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.154363 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 14:52:31.160933 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.154366 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 14:52:31.160933 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.154369 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 14:52:31.160933 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.154371 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 14:52:31.160933 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.154374 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 14:52:31.160933 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.154377 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 14:52:31.161412 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.154379 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 14:52:31.161412 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.154382 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 14:52:31.161412 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.154385 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 14:52:31.161412 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.154387 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 14:52:31.161412 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.154390 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 14:52:31.161412 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.154392 2579 feature_gate.go:328] unrecognized feature gate: 
SetEIPForNLBIngressController Apr 16 14:52:31.161412 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.154395 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 14:52:31.161412 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.154397 2579 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 14:52:31.161412 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.154402 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 14:52:31.161412 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.154404 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 14:52:31.161412 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.154407 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 14:52:31.161412 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.154409 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 14:52:31.161412 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.154412 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 14:52:31.161412 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.154414 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 14:52:31.161412 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.154417 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 14:52:31.161412 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.154419 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 14:52:31.161412 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.154422 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 14:52:31.161412 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.154425 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 14:52:31.161412 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.154427 2579 feature_gate.go:328] 
unrecognized feature gate: BootcNodeManagement Apr 16 14:52:31.161412 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.154430 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 14:52:31.161942 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.154432 2579 feature_gate.go:328] unrecognized feature gate: Example Apr 16 14:52:31.161942 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.154435 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 14:52:31.161942 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.154437 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 14:52:31.161942 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.154440 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 14:52:31.161942 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.154443 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 14:52:31.161942 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.154447 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 14:52:31.161942 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.154449 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 14:52:31.161942 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.154452 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 14:52:31.161942 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.154454 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 14:52:31.161942 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.154457 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 14:52:31.161942 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.154460 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 14:52:31.161942 ip-10-0-139-47 kubenswrapper[2579]: W0416 
14:52:31.154462 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 14:52:31.161942 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.154464 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 14:52:31.161942 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.154467 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 14:52:31.161942 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.154470 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 14:52:31.161942 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.154473 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 14:52:31.161942 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.154476 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 14:52:31.161942 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.154478 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 14:52:31.161942 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.154481 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 14:52:31.161942 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.154484 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 14:52:31.162480 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.154488 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 14:52:31.162480 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.154491 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 14:52:31.162480 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.154493 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 14:52:31.162480 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.154496 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 14:52:31.162480 ip-10-0-139-47 
kubenswrapper[2579]: W0416 14:52:31.154498 2579 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 14:52:31.162480 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.154502 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 14:52:31.162480 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.154506 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 14:52:31.162480 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.154509 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 14:52:31.162480 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.154512 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 14:52:31.162480 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.154515 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 14:52:31.162480 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.154518 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 14:52:31.162480 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.154520 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 14:52:31.162480 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.154523 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 14:52:31.162480 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.154525 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 14:52:31.162480 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.154528 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 14:52:31.162480 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.154530 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 14:52:31.162480 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.154533 2579 
feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 14:52:31.162480 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.154537 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 14:52:31.162480 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.154540 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 14:52:31.162974 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.154543 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 14:52:31.162974 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.154545 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 14:52:31.162974 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.154548 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 14:52:31.162974 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.154550 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 14:52:31.162974 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.154553 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 14:52:31.162974 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.155355 2579 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 14:52:31.162974 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.162509 2579 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 16 14:52:31.162974 ip-10-0-139-47 kubenswrapper[2579]: 
I0416 14:52:31.162526 2579 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 16 14:52:31.162974 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.162578 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 14:52:31.162974 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.162584 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 14:52:31.162974 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.162587 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 14:52:31.162974 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.162591 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 14:52:31.162974 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.162594 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 14:52:31.162974 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.162596 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 14:52:31.162974 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.162599 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 14:52:31.163352 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.162601 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 14:52:31.163352 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.162604 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 14:52:31.163352 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.162606 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 14:52:31.163352 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.162609 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 14:52:31.163352 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.162611 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 14:52:31.163352 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.162614 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 14:52:31.163352 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.162617 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 14:52:31.163352 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.162619 2579 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 14:52:31.163352 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.162622 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 14:52:31.163352 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.162625 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 14:52:31.163352 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.162627 2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 14:52:31.163352 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.162630 2579 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 14:52:31.163352 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.162632 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 14:52:31.163352 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.162635 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 14:52:31.163352 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.162637 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 14:52:31.163352 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.162640 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 14:52:31.163352 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.162643 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 14:52:31.163352 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.162645 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 14:52:31.163352 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.162649 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 14:52:31.163352 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.162652 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 14:52:31.163843 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.162655 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 14:52:31.163843 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.162657 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 14:52:31.163843 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.162660 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 14:52:31.163843 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.162664 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 14:52:31.163843 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.162669 2579 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 14:52:31.163843 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.162672 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 14:52:31.163843 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.162675 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 14:52:31.163843 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.162678 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 14:52:31.163843 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.162681 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 14:52:31.163843 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.162684 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 14:52:31.163843 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.162686 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 14:52:31.163843 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.162689 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 14:52:31.163843 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.162692 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 14:52:31.163843 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.162694 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 14:52:31.163843 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.162697 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 14:52:31.163843 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.162699 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 14:52:31.163843 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.162702 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 14:52:31.163843 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.162705 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 14:52:31.163843 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.162707 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 14:52:31.163843 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.162710 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 14:52:31.164351 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.162712 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 14:52:31.164351 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.162715 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 14:52:31.164351 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.162720 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 14:52:31.164351 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.162724 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 14:52:31.164351 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.162728 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 14:52:31.164351 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.162731 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 14:52:31.164351 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.162734 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 14:52:31.164351 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.162737 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 14:52:31.164351 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.162739 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 14:52:31.164351 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.162742 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 14:52:31.164351 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.162745 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 14:52:31.164351 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.162747 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 14:52:31.164351 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.162750 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 14:52:31.164351 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.162753 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 14:52:31.164351 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.162755 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 14:52:31.164351 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.162758 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 14:52:31.164351 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.162761 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 14:52:31.164351 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.162764 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 14:52:31.164351 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.162767 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 14:52:31.164351 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.162769 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 14:52:31.164830 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.162772 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 14:52:31.164830 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.162774 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 14:52:31.164830 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.162777 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 14:52:31.164830 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.162780 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 14:52:31.164830 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.162782 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 14:52:31.164830 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.162785 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 14:52:31.164830 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.162788 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 14:52:31.164830 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.162790 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 14:52:31.164830 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.162793 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 14:52:31.164830 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.162795 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 14:52:31.164830 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.162798 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 14:52:31.164830 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.162800 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 14:52:31.164830 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.162803 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 14:52:31.164830 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.162805 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 14:52:31.164830 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.162809 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 14:52:31.164830 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.162812 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 14:52:31.164830 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.162814 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 14:52:31.164830 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.162817 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 14:52:31.164830 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.162819 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 14:52:31.165300 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.162825 2579 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 14:52:31.165300 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.162946 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 14:52:31.165300 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.162952 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 14:52:31.165300 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.162955 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 14:52:31.165300 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.162958 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 14:52:31.165300 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.162961 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 14:52:31.165300 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.162964 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 14:52:31.165300 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.162967 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 14:52:31.165300 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.162969 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 14:52:31.165300 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.162972 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 14:52:31.165300 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.162974 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 14:52:31.165300 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.162978 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 14:52:31.165300 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.162981 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 14:52:31.165300 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.162984 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 14:52:31.165300 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.162987 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 14:52:31.165678 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.162989 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 14:52:31.165678 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.162992 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 14:52:31.165678 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.162995 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 14:52:31.165678 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.162997 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 14:52:31.165678 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.163000 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 14:52:31.165678 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.163002 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 14:52:31.165678 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.163005 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 14:52:31.165678 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.163008 2579 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 14:52:31.165678 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.163010 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 14:52:31.165678 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.163012 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 14:52:31.165678 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.163015 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 14:52:31.165678 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.163018 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 14:52:31.165678 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.163021 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 14:52:31.165678 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.163023 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 14:52:31.165678 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.163026 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 14:52:31.165678 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.163028 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 14:52:31.165678 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.163031 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 14:52:31.165678 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.163034 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 14:52:31.165678 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.163037 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 14:52:31.165678 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.163040 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 14:52:31.166195 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.163042 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 14:52:31.166195 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.163045 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 14:52:31.166195 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.163047 2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 14:52:31.166195 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.163050 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 14:52:31.166195 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.163052 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 14:52:31.166195 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.163055 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 14:52:31.166195 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.163057 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 14:52:31.166195 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.163060 2579 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 14:52:31.166195 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.163062 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 14:52:31.166195 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.163065 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 14:52:31.166195 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.163068 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 14:52:31.166195 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.163071 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 14:52:31.166195 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.163073 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 14:52:31.166195 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.163076 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 14:52:31.166195 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.163079 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 14:52:31.166195 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.163081 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 14:52:31.166195 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.163085 2579 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 14:52:31.166195 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.163087 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 14:52:31.166195 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.163090 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 14:52:31.166195 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.163092 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 14:52:31.166684 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.163095 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 14:52:31.166684 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.163097 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 14:52:31.166684 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.163100 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 14:52:31.166684 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.163103 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 14:52:31.166684 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.163106 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 14:52:31.166684 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.163108 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 14:52:31.166684 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.163112 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 14:52:31.166684 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.163116 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 14:52:31.166684 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.163119 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 14:52:31.166684 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.163121 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 14:52:31.166684 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.163124 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 14:52:31.166684 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.163127 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 14:52:31.166684 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.163129 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 14:52:31.166684 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.163132 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 14:52:31.166684 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.163134 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 14:52:31.166684 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.163137 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 14:52:31.166684 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.163140 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 14:52:31.166684 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.163142 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 14:52:31.166684 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.163145 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 14:52:31.167287 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.163147 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 14:52:31.167287 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.163150 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 14:52:31.167287 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.163153 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 14:52:31.167287 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.163156 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 14:52:31.167287 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.163159 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 14:52:31.167287 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.163161 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 14:52:31.167287 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.163164 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 14:52:31.167287 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.163167 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 14:52:31.167287 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.163169 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 14:52:31.167287 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.163172 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 14:52:31.167287 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.163174 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 14:52:31.167287 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.163177 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 14:52:31.167287 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:31.163179 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 14:52:31.167287 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.163184 2579 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 14:52:31.167287 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.163870 2579 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 16 14:52:31.167648 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.166318 2579 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 16 14:52:31.167648 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.167259 2579 server.go:1019] "Starting client certificate rotation"
Apr 16 14:52:31.167648 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.167355 2579 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 14:52:31.167648 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.167401 2579 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 14:52:31.191786 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.191762 2579 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 14:52:31.195083 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.195060 2579 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 14:52:31.208422 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.208395 2579 log.go:25] "Validated CRI v1 runtime API"
Apr 16 14:52:31.213541 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.213521 2579 log.go:25] "Validated CRI v1 image API"
Apr 16 14:52:31.214839 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.214811 2579 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 16 14:52:31.217191 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.217164 2579 fs.go:135] Filesystem UUIDs: map[3dbe4768-f618-44b1-8d77-66e62a419438:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 8818b04e-d4fe-478f-93af-2c1b9e21355f:/dev/nvme0n1p3]
Apr 16 14:52:31.217277 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.217187 2579 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 16 14:52:31.223364 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.223246 2579 manager.go:217] Machine: {Timestamp:2026-04-16 14:52:31.220948932 +0000 UTC m=+0.403486717 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3105356 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2e142bb26bc38fd1dd20c501cf4b3b SystemUUID:ec2e142b-b26b-c38f-d1dd-20c501cf4b3b BootID:2f9856ae-21ad-4c80-9abc-57e94d4c86bf Filesystems:[{Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6098944 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:3d:bb:2f:0c:95 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:3d:bb:2f:0c:95 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:72:e2:44:37:41:93 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 16 14:52:31.223364 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.223363 2579 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 16 14:52:31.223485 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.223456 2579 manager.go:233] Version: {KernelVersion:5.14.0-570.104.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260401-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 16 14:52:31.224317 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.224283 2579 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 14:52:31.224463 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.224319 2579 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-139-47.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 16 14:52:31.224509 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.224473 2579 topology_manager.go:138] "Creating topology manager with none policy"
Apr 16 14:52:31.224509 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.224482 2579 container_manager_linux.go:306] "Creating device plugin manager"
Apr 16 14:52:31.224509 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.224495 2579 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 14:52:31.224509 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.224508 2579 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 14:52:31.225348 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.225333 2579 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 14:52:31.225453 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.225444 2579 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 16 14:52:31.227427 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.227407 2579 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 14:52:31.227986 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.227974 2579 kubelet.go:491] "Attempting to sync node with API server"
Apr 16 14:52:31.228036 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.227997 2579 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 16 14:52:31.228036 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.228009 2579 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 16 14:52:31.228036 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.228018 2579 kubelet.go:397] "Adding apiserver pod source"
Apr 16 14:52:31.228036 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.228029 2579 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 16 14:52:31.229090 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.229079 2579 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 14:52:31.229129 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.229099 2579 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 14:52:31.231960 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.231945 2579 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 16 14:52:31.233628 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.233611 2579 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 16 14:52:31.234956 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.234940 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 16 14:52:31.235039 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.234967 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 16 14:52:31.235039 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.234980 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 16 14:52:31.235039 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.234991 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 16 14:52:31.235039 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.235001 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 16 14:52:31.235039
ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.235014 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 16 14:52:31.235039 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.235024 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 16 14:52:31.235039 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.235035 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 16 14:52:31.235269 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.235047 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 16 14:52:31.235269 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.235057 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 16 14:52:31.235269 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.235086 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 16 14:52:31.235269 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.235101 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 16 14:52:31.236241 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.236230 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 16 14:52:31.236306 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.236244 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 16 14:52:31.239935 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.239888 2579 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 16 14:52:31.240092 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.240081 2579 server.go:1295] "Started kubelet" Apr 16 14:52:31.241472 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.241353 2579 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 16 14:52:31.241596 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.241531 2579 server_v1.go:47] "podresources" method="list" 
useActivePods=true Apr 16 14:52:31.241718 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.240138 2579 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 16 14:52:31.241857 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:52:31.241838 2579 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-139-47.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 16 14:52:31.241970 ip-10-0-139-47 systemd[1]: Started Kubernetes Kubelet. Apr 16 14:52:31.242053 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:52:31.241963 2579 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 16 14:52:31.242053 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.242018 2579 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-139-47.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 16 14:52:31.243210 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.243085 2579 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 16 14:52:31.246713 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.246690 2579 server.go:317] "Adding debug handlers to kubelet server" Apr 16 14:52:31.249702 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:52:31.248660 2579 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" 
event="&Event{ObjectMeta:{ip-10-0-139-47.ec2.internal.18a6ddf5e602e8b6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-139-47.ec2.internal,UID:ip-10-0-139-47.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-139-47.ec2.internal,},FirstTimestamp:2026-04-16 14:52:31.239915702 +0000 UTC m=+0.422453471,LastTimestamp:2026-04-16 14:52:31.239915702 +0000 UTC m=+0.422453471,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-139-47.ec2.internal,}" Apr 16 14:52:31.250792 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:52:31.250776 2579 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 16 14:52:31.251214 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.251201 2579 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 16 14:52:31.251275 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.251203 2579 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 16 14:52:31.251877 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.251857 2579 factory.go:55] Registering systemd factory Apr 16 14:52:31.251877 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.251878 2579 factory.go:223] Registration of the systemd container factory successfully Apr 16 14:52:31.252036 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.252019 2579 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 16 14:52:31.252098 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.252072 2579 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 16 14:52:31.252098 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.252090 
2579 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 16 14:52:31.252191 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.252119 2579 factory.go:153] Registering CRI-O factory Apr 16 14:52:31.252191 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.252133 2579 factory.go:223] Registration of the crio container factory successfully Apr 16 14:52:31.252269 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.252191 2579 reconstruct.go:97] "Volume reconstruction finished" Apr 16 14:52:31.252269 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.252195 2579 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 16 14:52:31.252269 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.252201 2579 reconciler.go:26] "Reconciler: start to sync state" Apr 16 14:52:31.252269 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:52:31.252217 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-47.ec2.internal\" not found" Apr 16 14:52:31.252269 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.252228 2579 factory.go:103] Registering Raw factory Apr 16 14:52:31.252269 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.252243 2579 manager.go:1196] Started watching for new ooms in manager Apr 16 14:52:31.252674 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.252661 2579 manager.go:319] Starting recovery of all containers Apr 16 14:52:31.262979 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:52:31.262949 2579 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 16 
14:52:31.263084 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:52:31.263033 2579 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-139-47.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 16 14:52:31.264023 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.264005 2579 manager.go:324] Recovery completed Apr 16 14:52:31.269875 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.269746 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 14:52:31.272430 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.272411 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-47.ec2.internal" event="NodeHasSufficientMemory" Apr 16 14:52:31.272508 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.272449 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-47.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 14:52:31.272508 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.272464 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-47.ec2.internal" event="NodeHasSufficientPID" Apr 16 14:52:31.273057 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.273036 2579 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 16 14:52:31.273057 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.273054 2579 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 16 14:52:31.273160 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.273074 2579 state_mem.go:36] "Initialized new in-memory state store" Apr 16 14:52:31.275257 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.275245 2579 policy_none.go:49] "None policy: Start" Apr 16 14:52:31.275297 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.275261 2579 memory_manager.go:186] "Starting 
memorymanager" policy="None" Apr 16 14:52:31.276660 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.276650 2579 state_mem.go:35] "Initializing new in-memory state store" Apr 16 14:52:31.285938 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:52:31.285840 2579 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-139-47.ec2.internal.18a6ddf5e7f30e88 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-139-47.ec2.internal,UID:ip-10-0-139-47.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-139-47.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-139-47.ec2.internal,},FirstTimestamp:2026-04-16 14:52:31.27243124 +0000 UTC m=+0.454969010,LastTimestamp:2026-04-16 14:52:31.27243124 +0000 UTC m=+0.454969010,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-139-47.ec2.internal,}" Apr 16 14:52:31.296934 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.296894 2579 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-4nscn" Apr 16 14:52:31.311169 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.311150 2579 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-4nscn" Apr 16 14:52:31.324513 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:52:31.312110 2579 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" 
event="&Event{ObjectMeta:{ip-10-0-139-47.ec2.internal.18a6ddf5e7f36e0d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-139-47.ec2.internal,UID:ip-10-0-139-47.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node ip-10-0-139-47.ec2.internal status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-139-47.ec2.internal,},FirstTimestamp:2026-04-16 14:52:31.272455693 +0000 UTC m=+0.454993458,LastTimestamp:2026-04-16 14:52:31.272455693 +0000 UTC m=+0.454993458,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-139-47.ec2.internal,}" Apr 16 14:52:31.324513 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.320519 2579 manager.go:341] "Starting Device Plugin manager" Apr 16 14:52:31.324513 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:52:31.320556 2579 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 16 14:52:31.324513 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.320569 2579 server.go:85] "Starting device plugin registration server" Apr 16 14:52:31.324513 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.320835 2579 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 16 14:52:31.324513 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.320849 2579 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 16 14:52:31.324513 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.320953 2579 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 16 14:52:31.324513 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.321032 2579 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 16 14:52:31.324513 ip-10-0-139-47 
kubenswrapper[2579]: I0416 14:52:31.321042 2579 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 16 14:52:31.324513 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:52:31.321718 2579 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 16 14:52:31.324513 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:52:31.321747 2579 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-139-47.ec2.internal\" not found" Apr 16 14:52:31.348266 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.348236 2579 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 16 14:52:31.349486 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.349462 2579 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 16 14:52:31.349588 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.349492 2579 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 16 14:52:31.349588 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.349515 2579 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 16 14:52:31.349588 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.349522 2579 kubelet.go:2451] "Starting kubelet main sync loop" Apr 16 14:52:31.349588 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:52:31.349556 2579 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 16 14:52:31.365075 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.365049 2579 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 14:52:31.421541 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.421478 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 14:52:31.423685 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.423661 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-47.ec2.internal" event="NodeHasSufficientMemory" Apr 16 14:52:31.423811 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.423694 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-47.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 14:52:31.423811 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.423706 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-47.ec2.internal" event="NodeHasSufficientPID" Apr 16 14:52:31.423811 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.423734 2579 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-139-47.ec2.internal" Apr 16 14:52:31.431013 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.430997 2579 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-139-47.ec2.internal" Apr 16 14:52:31.431080 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:52:31.431021 2579 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-139-47.ec2.internal\": node \"ip-10-0-139-47.ec2.internal\" not found" Apr 16 14:52:31.449627 
ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.449599 2579 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-47.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-139-47.ec2.internal"] Apr 16 14:52:31.449767 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.449671 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 14:52:31.450572 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.450554 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-47.ec2.internal" event="NodeHasSufficientMemory" Apr 16 14:52:31.450638 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.450587 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-47.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 14:52:31.450638 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.450598 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-47.ec2.internal" event="NodeHasSufficientPID" Apr 16 14:52:31.451729 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.451715 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 14:52:31.451872 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.451854 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-47.ec2.internal" Apr 16 14:52:31.451953 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.451896 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 14:52:31.452679 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.452664 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-47.ec2.internal" event="NodeHasSufficientMemory" Apr 16 14:52:31.452679 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.452675 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-47.ec2.internal" event="NodeHasSufficientMemory" Apr 16 14:52:31.452761 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.452699 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-47.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 14:52:31.452761 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.452707 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-47.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 14:52:31.452761 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.452709 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-47.ec2.internal" event="NodeHasSufficientPID" Apr 16 14:52:31.452761 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.452721 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-47.ec2.internal" event="NodeHasSufficientPID" Apr 16 14:52:31.454475 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.454462 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-47.ec2.internal" Apr 16 14:52:31.454550 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.454485 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 14:52:31.454758 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:52:31.454744 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-47.ec2.internal\" not found" Apr 16 14:52:31.455294 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.455280 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-47.ec2.internal" event="NodeHasSufficientMemory" Apr 16 14:52:31.455345 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.455305 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-47.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 14:52:31.455345 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.455317 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-47.ec2.internal" event="NodeHasSufficientPID" Apr 16 14:52:31.479846 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:52:31.479819 2579 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-139-47.ec2.internal\" not found" node="ip-10-0-139-47.ec2.internal" Apr 16 14:52:31.484464 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:52:31.484439 2579 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-139-47.ec2.internal\" not found" node="ip-10-0-139-47.ec2.internal" Apr 16 14:52:31.552804 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.552778 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/0c5598820f1d3314f2d1a9302b66f570-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-139-47.ec2.internal\" (UID: \"0c5598820f1d3314f2d1a9302b66f570\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-47.ec2.internal" Apr 16 14:52:31.552804 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.552808 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0c5598820f1d3314f2d1a9302b66f570-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-139-47.ec2.internal\" (UID: \"0c5598820f1d3314f2d1a9302b66f570\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-47.ec2.internal" Apr 16 14:52:31.553019 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.552834 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/da3e9a6a335603c20524d6311693f61c-config\") pod \"kube-apiserver-proxy-ip-10-0-139-47.ec2.internal\" (UID: \"da3e9a6a335603c20524d6311693f61c\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-139-47.ec2.internal" Apr 16 14:52:31.555850 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:52:31.555831 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-47.ec2.internal\" not found" Apr 16 14:52:31.653308 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.653274 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/0c5598820f1d3314f2d1a9302b66f570-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-139-47.ec2.internal\" (UID: \"0c5598820f1d3314f2d1a9302b66f570\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-47.ec2.internal" Apr 16 14:52:31.653396 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.653319 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/0c5598820f1d3314f2d1a9302b66f570-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-139-47.ec2.internal\" (UID: \"0c5598820f1d3314f2d1a9302b66f570\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-47.ec2.internal"
Apr 16 14:52:31.653396 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.653358 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/da3e9a6a335603c20524d6311693f61c-config\") pod \"kube-apiserver-proxy-ip-10-0-139-47.ec2.internal\" (UID: \"da3e9a6a335603c20524d6311693f61c\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-139-47.ec2.internal"
Apr 16 14:52:31.653458 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.653388 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/0c5598820f1d3314f2d1a9302b66f570-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-139-47.ec2.internal\" (UID: \"0c5598820f1d3314f2d1a9302b66f570\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-47.ec2.internal"
Apr 16 14:52:31.653458 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.653404 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/da3e9a6a335603c20524d6311693f61c-config\") pod \"kube-apiserver-proxy-ip-10-0-139-47.ec2.internal\" (UID: \"da3e9a6a335603c20524d6311693f61c\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-139-47.ec2.internal"
Apr 16 14:52:31.653458 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.653435 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0c5598820f1d3314f2d1a9302b66f570-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-139-47.ec2.internal\" (UID: \"0c5598820f1d3314f2d1a9302b66f570\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-47.ec2.internal"
Apr 16 14:52:31.656344 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:52:31.656326 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-47.ec2.internal\" not found"
Apr 16 14:52:31.757265 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:52:31.757185 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-47.ec2.internal\" not found"
Apr 16 14:52:31.784387 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.784362 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-47.ec2.internal"
Apr 16 14:52:31.786897 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:31.786878 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-47.ec2.internal"
Apr 16 14:52:31.857505 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:52:31.857463 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-47.ec2.internal\" not found"
Apr 16 14:52:31.958052 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:52:31.958022 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-47.ec2.internal\" not found"
Apr 16 14:52:32.058601 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:52:32.058508 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-47.ec2.internal\" not found"
Apr 16 14:52:32.119963 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.119931 2579 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 14:52:32.151763 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.151709 2579 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-47.ec2.internal"
Apr 16 14:52:32.167118 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.167092 2579 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 14:52:32.167507 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.167476 2579 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 16 14:52:32.167627 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.167611 2579 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 14:52:32.168625 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.168608 2579 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-47.ec2.internal"
Apr 16 14:52:32.184412 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.184384 2579 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 14:52:32.228713 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.228690 2579 apiserver.go:52] "Watching apiserver"
Apr 16 14:52:32.236896 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.236873 2579 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 16 14:52:32.237274 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.237249 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-node-tuning-operator/tuned-96845","openshift-dns/node-resolver-mvlp4","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-47.ec2.internal","openshift-multus/multus-additional-cni-plugins-dtvll","openshift-multus/multus-flwwz","openshift-multus/network-metrics-daemon-vpmk7","kube-system/kube-apiserver-proxy-ip-10-0-139-47.ec2.internal","openshift-image-registry/node-ca-dl2m4","openshift-network-diagnostics/network-check-target-qh75b","openshift-network-operator/iptables-alerter-nsdp5","openshift-ovn-kubernetes/ovnkube-node-wrxcw","kube-system/konnectivity-agent-8jpq8","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jc5c4"]
Apr 16 14:52:32.239955 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.239936 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-nsdp5"
Apr 16 14:52:32.240135 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.240101 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-mvlp4"
Apr 16 14:52:32.241159 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.241138 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-dtvll"
Apr 16 14:52:32.243085 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.243065 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 16 14:52:32.243187 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.243073 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 16 14:52:32.243187 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.243132 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-flwwz"
Apr 16 14:52:32.243425 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.243408 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 16 14:52:32.243942 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.243896 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-nqc6k\""
Apr 16 14:52:32.243942 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.243921 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 16 14:52:32.244123 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.244066 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 16 14:52:32.244178 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.244126 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vpmk7"
Apr 16 14:52:32.244178 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.244161 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 16 14:52:32.244298 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.244201 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-dl2m4"
Apr 16 14:52:32.244298 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:52:32.244208 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vpmk7" podUID="ba7ca911-2481-4fe7-9079-e770b1840406"
Apr 16 14:52:32.244298 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.244237 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 16 14:52:32.244443 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.244302 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-h65jw\""
Apr 16 14:52:32.244476 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.244442 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 16 14:52:32.244607 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.244593 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 16 14:52:32.244645 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.244607 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 16 14:52:32.244645 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.244620 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-wdc67\""
Apr 16 14:52:32.245232 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.245218 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-lcdvs\""
Apr 16 14:52:32.245232 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.245228 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qh75b"
Apr 16 14:52:32.245317 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:52:32.245288 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qh75b" podUID="1609e3b0-3601-4e3b-bb3c-b89f06a20319"
Apr 16 14:52:32.245635 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.245623 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 16 14:52:32.246223 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.246209 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-96845"
Apr 16 14:52:32.247084 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.247045 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 16 14:52:32.247084 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.247066 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 16 14:52:32.247084 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.247074 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-t8kzr\""
Apr 16 14:52:32.247277 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.247066 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 16 14:52:32.248550 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.248530 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 16 14:52:32.248638 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.248549 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-wrxcw"
Apr 16 14:52:32.248638 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.248582 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-8jpq8"
Apr 16 14:52:32.249167 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.249150 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 16 14:52:32.249280 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.249265 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-x74kr\""
Apr 16 14:52:32.250932 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.250897 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jc5c4"
Apr 16 14:52:32.251288 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.251273 2579 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 16 14:52:32.253203 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.251634 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 16 14:52:32.253203 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.252005 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 16 14:52:32.253203 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.252031 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 16 14:52:32.253203 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.252392 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 16 14:52:32.253203 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.252674 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 16 14:52:32.253203 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.252932 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-nvq22\""
Apr 16 14:52:32.253567 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.253320 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 16 14:52:32.255314 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.253877 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 16 14:52:32.255314 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.254293 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 16 14:52:32.255314 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.254587 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 16 14:52:32.255314 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.255028 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 16 14:52:32.255314 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.255054 2579 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 16 14:52:32.255314 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.255227 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-qk77r\""
Apr 16 14:52:32.255630 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.255350 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-5ds8c\""
Apr 16 14:52:32.256415 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.256394 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/b9885cab-5589-4597-9b6e-6ef20f7bc2b2-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-dtvll\" (UID: \"b9885cab-5589-4597-9b6e-6ef20f7bc2b2\") " pod="openshift-multus/multus-additional-cni-plugins-dtvll"
Apr 16 14:52:32.256535 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.256425 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 16 14:52:32.256535 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.256529 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jgjx\" (UniqueName: \"kubernetes.io/projected/ba7ca911-2481-4fe7-9079-e770b1840406-kube-api-access-5jgjx\") pod \"network-metrics-daemon-vpmk7\" (UID: \"ba7ca911-2481-4fe7-9079-e770b1840406\") " pod="openshift-multus/network-metrics-daemon-vpmk7"
Apr 16 14:52:32.256704 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.256567 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/7c9231d4-b789-4d31-9ab1-0514edc9a681-device-dir\") pod \"aws-ebs-csi-driver-node-jc5c4\" (UID: \"7c9231d4-b789-4d31-9ab1-0514edc9a681\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jc5c4"
Apr 16 14:52:32.256704 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.256665 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/7c9231d4-b789-4d31-9ab1-0514edc9a681-sys-fs\") pod \"aws-ebs-csi-driver-node-jc5c4\" (UID: \"7c9231d4-b789-4d31-9ab1-0514edc9a681\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jc5c4"
Apr 16 14:52:32.256805 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.256704 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/15789381-f4bd-4597-8239-7c3b98a87e12-iptables-alerter-script\") pod \"iptables-alerter-nsdp5\" (UID: \"15789381-f4bd-4597-8239-7c3b98a87e12\") " pod="openshift-network-operator/iptables-alerter-nsdp5"
Apr 16 14:52:32.256805 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.256750 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/16b9981c-80e1-46aa-90b5-651d968a8850-multus-socket-dir-parent\") pod \"multus-flwwz\" (UID: \"16b9981c-80e1-46aa-90b5-651d968a8850\") " pod="openshift-multus/multus-flwwz"
Apr 16 14:52:32.256805 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.256782 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/16b9981c-80e1-46aa-90b5-651d968a8850-host-run-k8s-cni-cncf-io\") pod \"multus-flwwz\" (UID: \"16b9981c-80e1-46aa-90b5-651d968a8850\") " pod="openshift-multus/multus-flwwz"
Apr 16 14:52:32.256964 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.256857 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ac641325-3949-4503-86d7-77e5ededa110-log-socket\") pod \"ovnkube-node-wrxcw\" (UID: \"ac641325-3949-4503-86d7-77e5ededa110\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrxcw"
Apr 16 14:52:32.256964 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.256889 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b9885cab-5589-4597-9b6e-6ef20f7bc2b2-system-cni-dir\") pod \"multus-additional-cni-plugins-dtvll\" (UID: \"b9885cab-5589-4597-9b6e-6ef20f7bc2b2\") " pod="openshift-multus/multus-additional-cni-plugins-dtvll"
Apr 16 14:52:32.256964 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.256950 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/16b9981c-80e1-46aa-90b5-651d968a8850-host-run-netns\") pod \"multus-flwwz\" (UID: \"16b9981c-80e1-46aa-90b5-651d968a8850\") " pod="openshift-multus/multus-flwwz"
Apr 16 14:52:32.257111 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.256975 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/16b9981c-80e1-46aa-90b5-651d968a8850-host-var-lib-cni-bin\") pod \"multus-flwwz\" (UID: \"16b9981c-80e1-46aa-90b5-651d968a8850\") " pod="openshift-multus/multus-flwwz"
Apr 16 14:52:32.257111 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.257009 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/16b9981c-80e1-46aa-90b5-651d968a8850-etc-kubernetes\") pod \"multus-flwwz\" (UID: \"16b9981c-80e1-46aa-90b5-651d968a8850\") " pod="openshift-multus/multus-flwwz"
Apr 16 14:52:32.257111 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.257034 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ac641325-3949-4503-86d7-77e5ededa110-systemd-units\") pod \"ovnkube-node-wrxcw\" (UID: \"ac641325-3949-4503-86d7-77e5ededa110\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrxcw"
Apr 16 14:52:32.257111 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.257050 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ac641325-3949-4503-86d7-77e5ededa110-etc-openvswitch\") pod \"ovnkube-node-wrxcw\" (UID: \"ac641325-3949-4503-86d7-77e5ededa110\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrxcw"
Apr 16 14:52:32.257111 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.257068 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ac641325-3949-4503-86d7-77e5ededa110-run-openvswitch\") pod \"ovnkube-node-wrxcw\" (UID: \"ac641325-3949-4503-86d7-77e5ededa110\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrxcw"
Apr 16 14:52:32.257441 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.257154 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f9f0740c-e453-47f4-a40d-564dd1731056-lib-modules\") pod \"tuned-96845\" (UID: \"f9f0740c-e453-47f4-a40d-564dd1731056\") " pod="openshift-cluster-node-tuning-operator/tuned-96845"
Apr 16 14:52:32.257531 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.257466 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/16b9981c-80e1-46aa-90b5-651d968a8850-os-release\") pod \"multus-flwwz\" (UID: \"16b9981c-80e1-46aa-90b5-651d968a8850\") " pod="openshift-multus/multus-flwwz"
Apr 16 14:52:32.257531 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.257487 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/aabf9cf2-dcc2-4a42-b211-e34be354e3ed-konnectivity-ca\") pod \"konnectivity-agent-8jpq8\" (UID: \"aabf9cf2-dcc2-4a42-b211-e34be354e3ed\") " pod="kube-system/konnectivity-agent-8jpq8"
Apr 16 14:52:32.257531 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.257506 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crm5b\" (UniqueName: \"kubernetes.io/projected/1609e3b0-3601-4e3b-bb3c-b89f06a20319-kube-api-access-crm5b\") pod \"network-check-target-qh75b\" (UID: \"1609e3b0-3601-4e3b-bb3c-b89f06a20319\") " pod="openshift-network-diagnostics/network-check-target-qh75b"
Apr 16 14:52:32.257531 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.257527 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/f9f0740c-e453-47f4-a40d-564dd1731056-etc-modprobe-d\") pod \"tuned-96845\" (UID: \"f9f0740c-e453-47f4-a40d-564dd1731056\") " pod="openshift-cluster-node-tuning-operator/tuned-96845"
Apr 16 14:52:32.257717 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.257566 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5qfs\" (UniqueName: \"kubernetes.io/projected/ac641325-3949-4503-86d7-77e5ededa110-kube-api-access-p5qfs\") pod \"ovnkube-node-wrxcw\" (UID: \"ac641325-3949-4503-86d7-77e5ededa110\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrxcw"
Apr 16 14:52:32.257717 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.257585 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b9885cab-5589-4597-9b6e-6ef20f7bc2b2-os-release\") pod \"multus-additional-cni-plugins-dtvll\" (UID: \"b9885cab-5589-4597-9b6e-6ef20f7bc2b2\") " pod="openshift-multus/multus-additional-cni-plugins-dtvll"
Apr 16 14:52:32.257717 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.257603 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/da8fe018-bd6c-490e-9197-d5ea9c881a92-host\") pod \"node-ca-dl2m4\" (UID: \"da8fe018-bd6c-490e-9197-d5ea9c881a92\") " pod="openshift-image-registry/node-ca-dl2m4"
Apr 16 14:52:32.257717 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.257624 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/16b9981c-80e1-46aa-90b5-651d968a8850-host-var-lib-kubelet\") pod \"multus-flwwz\" (UID: \"16b9981c-80e1-46aa-90b5-651d968a8850\") " pod="openshift-multus/multus-flwwz"
Apr 16 14:52:32.257717 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.257640 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8797\" (UniqueName: \"kubernetes.io/projected/16b9981c-80e1-46aa-90b5-651d968a8850-kube-api-access-b8797\") pod \"multus-flwwz\" (UID: \"16b9981c-80e1-46aa-90b5-651d968a8850\") " pod="openshift-multus/multus-flwwz"
Apr 16 14:52:32.257717 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.257686 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ac641325-3949-4503-86d7-77e5ededa110-host-kubelet\") pod \"ovnkube-node-wrxcw\" (UID: \"ac641325-3949-4503-86d7-77e5ededa110\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrxcw"
Apr 16 14:52:32.258025 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.257734 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ac641325-3949-4503-86d7-77e5ededa110-host-run-ovn-kubernetes\") pod \"ovnkube-node-wrxcw\" (UID: \"ac641325-3949-4503-86d7-77e5ededa110\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrxcw"
Apr 16 14:52:32.258025 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.257767 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/f9f0740c-e453-47f4-a40d-564dd1731056-etc-sysctl-d\") pod \"tuned-96845\" (UID: \"f9f0740c-e453-47f4-a40d-564dd1731056\") " pod="openshift-cluster-node-tuning-operator/tuned-96845"
Apr 16 14:52:32.258025 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.257796 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ac641325-3949-4503-86d7-77e5ededa110-run-ovn\") pod \"ovnkube-node-wrxcw\" (UID: \"ac641325-3949-4503-86d7-77e5ededa110\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrxcw"
Apr 16 14:52:32.258025 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.257826 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c0f8acc5-b88a-4adf-b2de-441b178001bf-hosts-file\") pod \"node-resolver-mvlp4\" (UID: \"c0f8acc5-b88a-4adf-b2de-441b178001bf\") " pod="openshift-dns/node-resolver-mvlp4"
Apr 16 14:52:32.258025 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.257855 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b9885cab-5589-4597-9b6e-6ef20f7bc2b2-cni-binary-copy\") pod \"multus-additional-cni-plugins-dtvll\" (UID: \"b9885cab-5589-4597-9b6e-6ef20f7bc2b2\") " pod="openshift-multus/multus-additional-cni-plugins-dtvll"
Apr 16 14:52:32.258025 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.257897 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f9f0740c-e453-47f4-a40d-564dd1731056-host\") pod \"tuned-96845\" (UID: \"f9f0740c-e453-47f4-a40d-564dd1731056\") " pod="openshift-cluster-node-tuning-operator/tuned-96845"
Apr 16 14:52:32.258025 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.257951 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/16b9981c-80e1-46aa-90b5-651d968a8850-multus-cni-dir\") pod \"multus-flwwz\" (UID: \"16b9981c-80e1-46aa-90b5-651d968a8850\") " pod="openshift-multus/multus-flwwz"
Apr 16 14:52:32.258025 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.257999 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/16b9981c-80e1-46aa-90b5-651d968a8850-host-run-multus-certs\") pod \"multus-flwwz\" (UID: \"16b9981c-80e1-46aa-90b5-651d968a8850\") " pod="openshift-multus/multus-flwwz"
Apr 16 14:52:32.258379 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.258035 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ac641325-3949-4503-86d7-77e5ededa110-host-slash\") pod \"ovnkube-node-wrxcw\" (UID: \"ac641325-3949-4503-86d7-77e5ededa110\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrxcw"
Apr 16 14:52:32.258379 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.258061 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ac641325-3949-4503-86d7-77e5ededa110-node-log\") pod \"ovnkube-node-wrxcw\" (UID: \"ac641325-3949-4503-86d7-77e5ededa110\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrxcw"
Apr 16 14:52:32.258379 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.258081 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/da8fe018-bd6c-490e-9197-d5ea9c881a92-serviceca\") pod \"node-ca-dl2m4\" (UID: \"da8fe018-bd6c-490e-9197-d5ea9c881a92\") " pod="openshift-image-registry/node-ca-dl2m4"
Apr 16 14:52:32.258379 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.258113 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b9885cab-5589-4597-9b6e-6ef20f7bc2b2-tuning-conf-dir\") pod \"multus-additional-cni-plugins-dtvll\" (UID: \"b9885cab-5589-4597-9b6e-6ef20f7bc2b2\") " pod="openshift-multus/multus-additional-cni-plugins-dtvll"
Apr 16 14:52:32.258379 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.258140 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/15789381-f4bd-4597-8239-7c3b98a87e12-host-slash\") pod \"iptables-alerter-nsdp5\" (UID: \"15789381-f4bd-4597-8239-7c3b98a87e12\") " pod="openshift-network-operator/iptables-alerter-nsdp5"
Apr 16 14:52:32.258379 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.258162 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/16b9981c-80e1-46aa-90b5-651d968a8850-cnibin\") pod \"multus-flwwz\" (UID: \"16b9981c-80e1-46aa-90b5-651d968a8850\") " pod="openshift-multus/multus-flwwz"
Apr 16 14:52:32.258379 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.258177 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/16b9981c-80e1-46aa-90b5-651d968a8850-multus-conf-dir\") pod \"multus-flwwz\" (UID: \"16b9981c-80e1-46aa-90b5-651d968a8850\") " pod="openshift-multus/multus-flwwz"
Apr 16 14:52:32.258379 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.258192 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ac641325-3949-4503-86d7-77e5ededa110-host-cni-bin\") pod \"ovnkube-node-wrxcw\" (UID: \"ac641325-3949-4503-86d7-77e5ededa110\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrxcw"
Apr 16 14:52:32.258379 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.258207 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ac641325-3949-4503-86d7-77e5ededa110-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wrxcw\" (UID: \"ac641325-3949-4503-86d7-77e5ededa110\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrxcw"
Apr 16 14:52:32.258379 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.258223 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ac641325-3949-4503-86d7-77e5ededa110-ovnkube-config\") pod \"ovnkube-node-wrxcw\" (UID: \"ac641325-3949-4503-86d7-77e5ededa110\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrxcw"
Apr 16 14:52:32.258379 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.258248 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/aabf9cf2-dcc2-4a42-b211-e34be354e3ed-agent-certs\") pod \"konnectivity-agent-8jpq8\" (UID: \"aabf9cf2-dcc2-4a42-b211-e34be354e3ed\") " pod="kube-system/konnectivity-agent-8jpq8"
Apr 16 14:52:32.258379 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.258271 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c0f8acc5-b88a-4adf-b2de-441b178001bf-tmp-dir\") pod \"node-resolver-mvlp4\" (UID: \"c0f8acc5-b88a-4adf-b2de-441b178001bf\") " pod="openshift-dns/node-resolver-mvlp4"
Apr 16 14:52:32.258379 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.258310 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f9f0740c-e453-47f4-a40d-564dd1731056-etc-kubernetes\") pod \"tuned-96845\" (UID: \"f9f0740c-e453-47f4-a40d-564dd1731056\") " pod="openshift-cluster-node-tuning-operator/tuned-96845"
Apr 16 14:52:32.258379 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.258333 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b9885cab-5589-4597-9b6e-6ef20f7bc2b2-cnibin\") pod \"multus-additional-cni-plugins-dtvll\" (UID: \"b9885cab-5589-4597-9b6e-6ef20f7bc2b2\") " pod="openshift-multus/multus-additional-cni-plugins-dtvll" Apr
16 14:52:32.258379 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.258370 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gw5qr\" (UniqueName: \"kubernetes.io/projected/15789381-f4bd-4597-8239-7c3b98a87e12-kube-api-access-gw5qr\") pod \"iptables-alerter-nsdp5\" (UID: \"15789381-f4bd-4597-8239-7c3b98a87e12\") " pod="openshift-network-operator/iptables-alerter-nsdp5" Apr 16 14:52:32.259078 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.258405 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/f9f0740c-e453-47f4-a40d-564dd1731056-etc-sysconfig\") pod \"tuned-96845\" (UID: \"f9f0740c-e453-47f4-a40d-564dd1731056\") " pod="openshift-cluster-node-tuning-operator/tuned-96845" Apr 16 14:52:32.259078 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.258427 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f9f0740c-e453-47f4-a40d-564dd1731056-sys\") pod \"tuned-96845\" (UID: \"f9f0740c-e453-47f4-a40d-564dd1731056\") " pod="openshift-cluster-node-tuning-operator/tuned-96845" Apr 16 14:52:32.259078 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.258449 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7c9231d4-b789-4d31-9ab1-0514edc9a681-kubelet-dir\") pod \"aws-ebs-csi-driver-node-jc5c4\" (UID: \"7c9231d4-b789-4d31-9ab1-0514edc9a681\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jc5c4" Apr 16 14:52:32.259078 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.258469 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vl5fm\" (UniqueName: 
\"kubernetes.io/projected/7c9231d4-b789-4d31-9ab1-0514edc9a681-kube-api-access-vl5fm\") pod \"aws-ebs-csi-driver-node-jc5c4\" (UID: \"7c9231d4-b789-4d31-9ab1-0514edc9a681\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jc5c4" Apr 16 14:52:32.259078 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.258484 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b9885cab-5589-4597-9b6e-6ef20f7bc2b2-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-dtvll\" (UID: \"b9885cab-5589-4597-9b6e-6ef20f7bc2b2\") " pod="openshift-multus/multus-additional-cni-plugins-dtvll" Apr 16 14:52:32.259078 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.258511 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f9f0740c-e453-47f4-a40d-564dd1731056-tmp\") pod \"tuned-96845\" (UID: \"f9f0740c-e453-47f4-a40d-564dd1731056\") " pod="openshift-cluster-node-tuning-operator/tuned-96845" Apr 16 14:52:32.259078 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.258536 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/7c9231d4-b789-4d31-9ab1-0514edc9a681-registration-dir\") pod \"aws-ebs-csi-driver-node-jc5c4\" (UID: \"7c9231d4-b789-4d31-9ab1-0514edc9a681\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jc5c4" Apr 16 14:52:32.259078 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.258564 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ba7ca911-2481-4fe7-9079-e770b1840406-metrics-certs\") pod \"network-metrics-daemon-vpmk7\" (UID: \"ba7ca911-2481-4fe7-9079-e770b1840406\") " pod="openshift-multus/network-metrics-daemon-vpmk7" Apr 16 
14:52:32.259078 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.258587 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ac641325-3949-4503-86d7-77e5ededa110-host-run-netns\") pod \"ovnkube-node-wrxcw\" (UID: \"ac641325-3949-4503-86d7-77e5ededa110\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrxcw" Apr 16 14:52:32.259078 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.258611 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/16b9981c-80e1-46aa-90b5-651d968a8850-host-var-lib-cni-multus\") pod \"multus-flwwz\" (UID: \"16b9981c-80e1-46aa-90b5-651d968a8850\") " pod="openshift-multus/multus-flwwz" Apr 16 14:52:32.259078 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.258637 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/16b9981c-80e1-46aa-90b5-651d968a8850-system-cni-dir\") pod \"multus-flwwz\" (UID: \"16b9981c-80e1-46aa-90b5-651d968a8850\") " pod="openshift-multus/multus-flwwz" Apr 16 14:52:32.259078 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.258660 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgbzj\" (UniqueName: \"kubernetes.io/projected/da8fe018-bd6c-490e-9197-d5ea9c881a92-kube-api-access-tgbzj\") pod \"node-ca-dl2m4\" (UID: \"da8fe018-bd6c-490e-9197-d5ea9c881a92\") " pod="openshift-image-registry/node-ca-dl2m4" Apr 16 14:52:32.259078 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.258695 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/7c9231d4-b789-4d31-9ab1-0514edc9a681-socket-dir\") pod \"aws-ebs-csi-driver-node-jc5c4\" (UID: 
\"7c9231d4-b789-4d31-9ab1-0514edc9a681\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jc5c4" Apr 16 14:52:32.259078 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.258731 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ac641325-3949-4503-86d7-77e5ededa110-run-systemd\") pod \"ovnkube-node-wrxcw\" (UID: \"ac641325-3949-4503-86d7-77e5ededa110\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrxcw" Apr 16 14:52:32.259078 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.258754 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ac641325-3949-4503-86d7-77e5ededa110-host-cni-netd\") pod \"ovnkube-node-wrxcw\" (UID: \"ac641325-3949-4503-86d7-77e5ededa110\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrxcw" Apr 16 14:52:32.259078 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.258778 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ac641325-3949-4503-86d7-77e5ededa110-ovn-node-metrics-cert\") pod \"ovnkube-node-wrxcw\" (UID: \"ac641325-3949-4503-86d7-77e5ededa110\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrxcw" Apr 16 14:52:32.259839 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.258802 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/f9f0740c-e453-47f4-a40d-564dd1731056-etc-sysctl-conf\") pod \"tuned-96845\" (UID: \"f9f0740c-e453-47f4-a40d-564dd1731056\") " pod="openshift-cluster-node-tuning-operator/tuned-96845" Apr 16 14:52:32.259839 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.258823 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" 
(UniqueName: \"kubernetes.io/host-path/f9f0740c-e453-47f4-a40d-564dd1731056-etc-systemd\") pod \"tuned-96845\" (UID: \"f9f0740c-e453-47f4-a40d-564dd1731056\") " pod="openshift-cluster-node-tuning-operator/tuned-96845" Apr 16 14:52:32.259839 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.258857 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/f9f0740c-e453-47f4-a40d-564dd1731056-etc-tuned\") pod \"tuned-96845\" (UID: \"f9f0740c-e453-47f4-a40d-564dd1731056\") " pod="openshift-cluster-node-tuning-operator/tuned-96845" Apr 16 14:52:32.259839 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.258880 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/7c9231d4-b789-4d31-9ab1-0514edc9a681-etc-selinux\") pod \"aws-ebs-csi-driver-node-jc5c4\" (UID: \"7c9231d4-b789-4d31-9ab1-0514edc9a681\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jc5c4" Apr 16 14:52:32.259839 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.258918 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/16b9981c-80e1-46aa-90b5-651d968a8850-cni-binary-copy\") pod \"multus-flwwz\" (UID: \"16b9981c-80e1-46aa-90b5-651d968a8850\") " pod="openshift-multus/multus-flwwz" Apr 16 14:52:32.259839 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.258940 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/16b9981c-80e1-46aa-90b5-651d968a8850-hostroot\") pod \"multus-flwwz\" (UID: \"16b9981c-80e1-46aa-90b5-651d968a8850\") " pod="openshift-multus/multus-flwwz" Apr 16 14:52:32.259839 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.258972 2579 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/16b9981c-80e1-46aa-90b5-651d968a8850-multus-daemon-config\") pod \"multus-flwwz\" (UID: \"16b9981c-80e1-46aa-90b5-651d968a8850\") " pod="openshift-multus/multus-flwwz" Apr 16 14:52:32.259839 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.259007 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ac641325-3949-4503-86d7-77e5ededa110-var-lib-openvswitch\") pod \"ovnkube-node-wrxcw\" (UID: \"ac641325-3949-4503-86d7-77e5ededa110\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrxcw" Apr 16 14:52:32.259839 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.259034 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ac641325-3949-4503-86d7-77e5ededa110-env-overrides\") pod \"ovnkube-node-wrxcw\" (UID: \"ac641325-3949-4503-86d7-77e5ededa110\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrxcw" Apr 16 14:52:32.259839 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.259056 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xj66\" (UniqueName: \"kubernetes.io/projected/b9885cab-5589-4597-9b6e-6ef20f7bc2b2-kube-api-access-9xj66\") pod \"multus-additional-cni-plugins-dtvll\" (UID: \"b9885cab-5589-4597-9b6e-6ef20f7bc2b2\") " pod="openshift-multus/multus-additional-cni-plugins-dtvll" Apr 16 14:52:32.259839 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.259071 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f9f0740c-e453-47f4-a40d-564dd1731056-run\") pod \"tuned-96845\" (UID: \"f9f0740c-e453-47f4-a40d-564dd1731056\") " 
pod="openshift-cluster-node-tuning-operator/tuned-96845" Apr 16 14:52:32.259839 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.259088 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f9f0740c-e453-47f4-a40d-564dd1731056-var-lib-kubelet\") pod \"tuned-96845\" (UID: \"f9f0740c-e453-47f4-a40d-564dd1731056\") " pod="openshift-cluster-node-tuning-operator/tuned-96845" Apr 16 14:52:32.259839 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.259120 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ac641325-3949-4503-86d7-77e5ededa110-ovnkube-script-lib\") pod \"ovnkube-node-wrxcw\" (UID: \"ac641325-3949-4503-86d7-77e5ededa110\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrxcw" Apr 16 14:52:32.259839 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.259141 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwkxq\" (UniqueName: \"kubernetes.io/projected/c0f8acc5-b88a-4adf-b2de-441b178001bf-kube-api-access-xwkxq\") pod \"node-resolver-mvlp4\" (UID: \"c0f8acc5-b88a-4adf-b2de-441b178001bf\") " pod="openshift-dns/node-resolver-mvlp4" Apr 16 14:52:32.259839 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.259155 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzxgj\" (UniqueName: \"kubernetes.io/projected/f9f0740c-e453-47f4-a40d-564dd1731056-kube-api-access-kzxgj\") pod \"tuned-96845\" (UID: \"f9f0740c-e453-47f4-a40d-564dd1731056\") " pod="openshift-cluster-node-tuning-operator/tuned-96845" Apr 16 14:52:32.267251 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.267228 2579 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" 
reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 14:52:32.288500 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.288477 2579 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-nhbg4" Apr 16 14:52:32.297361 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.297340 2579 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-nhbg4" Apr 16 14:52:32.312777 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.312699 2579 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 14:47:31 +0000 UTC" deadline="2027-10-22 00:20:24.751005894 +0000 UTC" Apr 16 14:52:32.312777 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.312726 2579 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13281h27m52.438282475s" Apr 16 14:52:32.316318 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.316296 2579 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 14:52:32.337518 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:32.337484 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda3e9a6a335603c20524d6311693f61c.slice/crio-ebcb2259ab0376a4f791ca34a60326c70b93fbcaa9146d48526da68f9a2fddaa WatchSource:0}: Error finding container ebcb2259ab0376a4f791ca34a60326c70b93fbcaa9146d48526da68f9a2fddaa: Status 404 returned error can't find the container with id ebcb2259ab0376a4f791ca34a60326c70b93fbcaa9146d48526da68f9a2fddaa Apr 16 14:52:32.337893 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:32.337869 2579 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c5598820f1d3314f2d1a9302b66f570.slice/crio-f813bc3bf3803575a9936112530e3cf920f2d87a31acbe68a30dd337d2cda89b WatchSource:0}: Error finding container f813bc3bf3803575a9936112530e3cf920f2d87a31acbe68a30dd337d2cda89b: Status 404 returned error can't find the container with id f813bc3bf3803575a9936112530e3cf920f2d87a31acbe68a30dd337d2cda89b Apr 16 14:52:32.342173 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.342143 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 14:52:32.352660 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.352611 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-47.ec2.internal" event={"ID":"da3e9a6a335603c20524d6311693f61c","Type":"ContainerStarted","Data":"ebcb2259ab0376a4f791ca34a60326c70b93fbcaa9146d48526da68f9a2fddaa"} Apr 16 14:52:32.353670 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.353650 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-47.ec2.internal" event={"ID":"0c5598820f1d3314f2d1a9302b66f570","Type":"ContainerStarted","Data":"f813bc3bf3803575a9936112530e3cf920f2d87a31acbe68a30dd337d2cda89b"} Apr 16 14:52:32.360255 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.360072 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ac641325-3949-4503-86d7-77e5ededa110-ovnkube-script-lib\") pod \"ovnkube-node-wrxcw\" (UID: \"ac641325-3949-4503-86d7-77e5ededa110\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrxcw" Apr 16 14:52:32.360255 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.360236 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xwkxq\" (UniqueName: 
\"kubernetes.io/projected/c0f8acc5-b88a-4adf-b2de-441b178001bf-kube-api-access-xwkxq\") pod \"node-resolver-mvlp4\" (UID: \"c0f8acc5-b88a-4adf-b2de-441b178001bf\") " pod="openshift-dns/node-resolver-mvlp4" Apr 16 14:52:32.362316 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.360319 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kzxgj\" (UniqueName: \"kubernetes.io/projected/f9f0740c-e453-47f4-a40d-564dd1731056-kube-api-access-kzxgj\") pod \"tuned-96845\" (UID: \"f9f0740c-e453-47f4-a40d-564dd1731056\") " pod="openshift-cluster-node-tuning-operator/tuned-96845" Apr 16 14:52:32.362316 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.360364 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/b9885cab-5589-4597-9b6e-6ef20f7bc2b2-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-dtvll\" (UID: \"b9885cab-5589-4597-9b6e-6ef20f7bc2b2\") " pod="openshift-multus/multus-additional-cni-plugins-dtvll" Apr 16 14:52:32.362316 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.360404 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5jgjx\" (UniqueName: \"kubernetes.io/projected/ba7ca911-2481-4fe7-9079-e770b1840406-kube-api-access-5jgjx\") pod \"network-metrics-daemon-vpmk7\" (UID: \"ba7ca911-2481-4fe7-9079-e770b1840406\") " pod="openshift-multus/network-metrics-daemon-vpmk7" Apr 16 14:52:32.362316 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.360447 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/7c9231d4-b789-4d31-9ab1-0514edc9a681-device-dir\") pod \"aws-ebs-csi-driver-node-jc5c4\" (UID: \"7c9231d4-b789-4d31-9ab1-0514edc9a681\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jc5c4" Apr 16 14:52:32.362316 ip-10-0-139-47 kubenswrapper[2579]: I0416 
14:52:32.360504 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/7c9231d4-b789-4d31-9ab1-0514edc9a681-sys-fs\") pod \"aws-ebs-csi-driver-node-jc5c4\" (UID: \"7c9231d4-b789-4d31-9ab1-0514edc9a681\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jc5c4" Apr 16 14:52:32.362316 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.360547 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/15789381-f4bd-4597-8239-7c3b98a87e12-iptables-alerter-script\") pod \"iptables-alerter-nsdp5\" (UID: \"15789381-f4bd-4597-8239-7c3b98a87e12\") " pod="openshift-network-operator/iptables-alerter-nsdp5" Apr 16 14:52:32.362316 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.360591 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/16b9981c-80e1-46aa-90b5-651d968a8850-multus-socket-dir-parent\") pod \"multus-flwwz\" (UID: \"16b9981c-80e1-46aa-90b5-651d968a8850\") " pod="openshift-multus/multus-flwwz" Apr 16 14:52:32.362316 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.360623 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/16b9981c-80e1-46aa-90b5-651d968a8850-host-run-k8s-cni-cncf-io\") pod \"multus-flwwz\" (UID: \"16b9981c-80e1-46aa-90b5-651d968a8850\") " pod="openshift-multus/multus-flwwz" Apr 16 14:52:32.362316 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.360657 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ac641325-3949-4503-86d7-77e5ededa110-log-socket\") pod \"ovnkube-node-wrxcw\" (UID: \"ac641325-3949-4503-86d7-77e5ededa110\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrxcw" Apr 16 14:52:32.362316 
ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.360694 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b9885cab-5589-4597-9b6e-6ef20f7bc2b2-system-cni-dir\") pod \"multus-additional-cni-plugins-dtvll\" (UID: \"b9885cab-5589-4597-9b6e-6ef20f7bc2b2\") " pod="openshift-multus/multus-additional-cni-plugins-dtvll" Apr 16 14:52:32.362316 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.360741 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/16b9981c-80e1-46aa-90b5-651d968a8850-host-run-netns\") pod \"multus-flwwz\" (UID: \"16b9981c-80e1-46aa-90b5-651d968a8850\") " pod="openshift-multus/multus-flwwz" Apr 16 14:52:32.362316 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.360770 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/16b9981c-80e1-46aa-90b5-651d968a8850-host-var-lib-cni-bin\") pod \"multus-flwwz\" (UID: \"16b9981c-80e1-46aa-90b5-651d968a8850\") " pod="openshift-multus/multus-flwwz" Apr 16 14:52:32.362316 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.360794 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/16b9981c-80e1-46aa-90b5-651d968a8850-etc-kubernetes\") pod \"multus-flwwz\" (UID: \"16b9981c-80e1-46aa-90b5-651d968a8850\") " pod="openshift-multus/multus-flwwz" Apr 16 14:52:32.362316 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.360823 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ac641325-3949-4503-86d7-77e5ededa110-systemd-units\") pod \"ovnkube-node-wrxcw\" (UID: \"ac641325-3949-4503-86d7-77e5ededa110\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrxcw" Apr 16 14:52:32.362316 
ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.360855 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ac641325-3949-4503-86d7-77e5ededa110-etc-openvswitch\") pod \"ovnkube-node-wrxcw\" (UID: \"ac641325-3949-4503-86d7-77e5ededa110\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrxcw" Apr 16 14:52:32.362316 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.360889 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ac641325-3949-4503-86d7-77e5ededa110-run-openvswitch\") pod \"ovnkube-node-wrxcw\" (UID: \"ac641325-3949-4503-86d7-77e5ededa110\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrxcw" Apr 16 14:52:32.362316 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.360893 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ac641325-3949-4503-86d7-77e5ededa110-ovnkube-script-lib\") pod \"ovnkube-node-wrxcw\" (UID: \"ac641325-3949-4503-86d7-77e5ededa110\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrxcw" Apr 16 14:52:32.363053 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.360937 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f9f0740c-e453-47f4-a40d-564dd1731056-lib-modules\") pod \"tuned-96845\" (UID: \"f9f0740c-e453-47f4-a40d-564dd1731056\") " pod="openshift-cluster-node-tuning-operator/tuned-96845" Apr 16 14:52:32.363053 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.360971 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/16b9981c-80e1-46aa-90b5-651d968a8850-os-release\") pod \"multus-flwwz\" (UID: \"16b9981c-80e1-46aa-90b5-651d968a8850\") " pod="openshift-multus/multus-flwwz" Apr 16 14:52:32.363053 ip-10-0-139-47 
kubenswrapper[2579]: I0416 14:52:32.360997 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/aabf9cf2-dcc2-4a42-b211-e34be354e3ed-konnectivity-ca\") pod \"konnectivity-agent-8jpq8\" (UID: \"aabf9cf2-dcc2-4a42-b211-e34be354e3ed\") " pod="kube-system/konnectivity-agent-8jpq8" Apr 16 14:52:32.363053 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.361024 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b9885cab-5589-4597-9b6e-6ef20f7bc2b2-system-cni-dir\") pod \"multus-additional-cni-plugins-dtvll\" (UID: \"b9885cab-5589-4597-9b6e-6ef20f7bc2b2\") " pod="openshift-multus/multus-additional-cni-plugins-dtvll" Apr 16 14:52:32.363053 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.361033 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-crm5b\" (UniqueName: \"kubernetes.io/projected/1609e3b0-3601-4e3b-bb3c-b89f06a20319-kube-api-access-crm5b\") pod \"network-check-target-qh75b\" (UID: \"1609e3b0-3601-4e3b-bb3c-b89f06a20319\") " pod="openshift-network-diagnostics/network-check-target-qh75b" Apr 16 14:52:32.363053 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.361066 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/f9f0740c-e453-47f4-a40d-564dd1731056-etc-modprobe-d\") pod \"tuned-96845\" (UID: \"f9f0740c-e453-47f4-a40d-564dd1731056\") " pod="openshift-cluster-node-tuning-operator/tuned-96845" Apr 16 14:52:32.363053 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.361101 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p5qfs\" (UniqueName: \"kubernetes.io/projected/ac641325-3949-4503-86d7-77e5ededa110-kube-api-access-p5qfs\") pod \"ovnkube-node-wrxcw\" (UID: \"ac641325-3949-4503-86d7-77e5ededa110\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-wrxcw" Apr 16 14:52:32.363053 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.361101 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/7c9231d4-b789-4d31-9ab1-0514edc9a681-sys-fs\") pod \"aws-ebs-csi-driver-node-jc5c4\" (UID: \"7c9231d4-b789-4d31-9ab1-0514edc9a681\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jc5c4" Apr 16 14:52:32.363053 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.361135 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b9885cab-5589-4597-9b6e-6ef20f7bc2b2-os-release\") pod \"multus-additional-cni-plugins-dtvll\" (UID: \"b9885cab-5589-4597-9b6e-6ef20f7bc2b2\") " pod="openshift-multus/multus-additional-cni-plugins-dtvll" Apr 16 14:52:32.363053 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.361160 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/da8fe018-bd6c-490e-9197-d5ea9c881a92-host\") pod \"node-ca-dl2m4\" (UID: \"da8fe018-bd6c-490e-9197-d5ea9c881a92\") " pod="openshift-image-registry/node-ca-dl2m4" Apr 16 14:52:32.363053 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.361203 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/16b9981c-80e1-46aa-90b5-651d968a8850-host-var-lib-kubelet\") pod \"multus-flwwz\" (UID: \"16b9981c-80e1-46aa-90b5-651d968a8850\") " pod="openshift-multus/multus-flwwz" Apr 16 14:52:32.363053 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.361236 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b8797\" (UniqueName: \"kubernetes.io/projected/16b9981c-80e1-46aa-90b5-651d968a8850-kube-api-access-b8797\") pod \"multus-flwwz\" (UID: \"16b9981c-80e1-46aa-90b5-651d968a8850\") 
" pod="openshift-multus/multus-flwwz" Apr 16 14:52:32.363053 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.361241 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/b9885cab-5589-4597-9b6e-6ef20f7bc2b2-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-dtvll\" (UID: \"b9885cab-5589-4597-9b6e-6ef20f7bc2b2\") " pod="openshift-multus/multus-additional-cni-plugins-dtvll" Apr 16 14:52:32.363053 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.361267 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ac641325-3949-4503-86d7-77e5ededa110-host-kubelet\") pod \"ovnkube-node-wrxcw\" (UID: \"ac641325-3949-4503-86d7-77e5ededa110\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrxcw" Apr 16 14:52:32.363053 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.361326 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ac641325-3949-4503-86d7-77e5ededa110-host-kubelet\") pod \"ovnkube-node-wrxcw\" (UID: \"ac641325-3949-4503-86d7-77e5ededa110\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrxcw" Apr 16 14:52:32.363053 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.361377 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/16b9981c-80e1-46aa-90b5-651d968a8850-host-run-netns\") pod \"multus-flwwz\" (UID: \"16b9981c-80e1-46aa-90b5-651d968a8850\") " pod="openshift-multus/multus-flwwz" Apr 16 14:52:32.363053 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.361428 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/7c9231d4-b789-4d31-9ab1-0514edc9a681-device-dir\") pod \"aws-ebs-csi-driver-node-jc5c4\" (UID: \"7c9231d4-b789-4d31-9ab1-0514edc9a681\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jc5c4" Apr 16 14:52:32.363764 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.361656 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/15789381-f4bd-4597-8239-7c3b98a87e12-iptables-alerter-script\") pod \"iptables-alerter-nsdp5\" (UID: \"15789381-f4bd-4597-8239-7c3b98a87e12\") " pod="openshift-network-operator/iptables-alerter-nsdp5" Apr 16 14:52:32.363764 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.361716 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/16b9981c-80e1-46aa-90b5-651d968a8850-multus-socket-dir-parent\") pod \"multus-flwwz\" (UID: \"16b9981c-80e1-46aa-90b5-651d968a8850\") " pod="openshift-multus/multus-flwwz" Apr 16 14:52:32.363764 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.361726 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/da8fe018-bd6c-490e-9197-d5ea9c881a92-host\") pod \"node-ca-dl2m4\" (UID: \"da8fe018-bd6c-490e-9197-d5ea9c881a92\") " pod="openshift-image-registry/node-ca-dl2m4" Apr 16 14:52:32.363764 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.361840 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/f9f0740c-e453-47f4-a40d-564dd1731056-etc-modprobe-d\") pod \"tuned-96845\" (UID: \"f9f0740c-e453-47f4-a40d-564dd1731056\") " pod="openshift-cluster-node-tuning-operator/tuned-96845" Apr 16 14:52:32.363764 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.361888 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/16b9981c-80e1-46aa-90b5-651d968a8850-host-run-k8s-cni-cncf-io\") pod \"multus-flwwz\" (UID: \"16b9981c-80e1-46aa-90b5-651d968a8850\") " 
pod="openshift-multus/multus-flwwz" Apr 16 14:52:32.363764 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.361941 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/aabf9cf2-dcc2-4a42-b211-e34be354e3ed-konnectivity-ca\") pod \"konnectivity-agent-8jpq8\" (UID: \"aabf9cf2-dcc2-4a42-b211-e34be354e3ed\") " pod="kube-system/konnectivity-agent-8jpq8" Apr 16 14:52:32.363764 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.361961 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ac641325-3949-4503-86d7-77e5ededa110-etc-openvswitch\") pod \"ovnkube-node-wrxcw\" (UID: \"ac641325-3949-4503-86d7-77e5ededa110\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrxcw" Apr 16 14:52:32.363764 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.361985 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ac641325-3949-4503-86d7-77e5ededa110-host-run-ovn-kubernetes\") pod \"ovnkube-node-wrxcw\" (UID: \"ac641325-3949-4503-86d7-77e5ededa110\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrxcw" Apr 16 14:52:32.363764 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.362013 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/16b9981c-80e1-46aa-90b5-651d968a8850-host-var-lib-cni-bin\") pod \"multus-flwwz\" (UID: \"16b9981c-80e1-46aa-90b5-651d968a8850\") " pod="openshift-multus/multus-flwwz" Apr 16 14:52:32.363764 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.362025 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/f9f0740c-e453-47f4-a40d-564dd1731056-etc-sysctl-d\") pod \"tuned-96845\" (UID: \"f9f0740c-e453-47f4-a40d-564dd1731056\") " 
pod="openshift-cluster-node-tuning-operator/tuned-96845" Apr 16 14:52:32.363764 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.362058 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ac641325-3949-4503-86d7-77e5ededa110-systemd-units\") pod \"ovnkube-node-wrxcw\" (UID: \"ac641325-3949-4503-86d7-77e5ededa110\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrxcw" Apr 16 14:52:32.363764 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.362061 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ac641325-3949-4503-86d7-77e5ededa110-run-ovn\") pod \"ovnkube-node-wrxcw\" (UID: \"ac641325-3949-4503-86d7-77e5ededa110\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrxcw" Apr 16 14:52:32.363764 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.362038 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ac641325-3949-4503-86d7-77e5ededa110-run-openvswitch\") pod \"ovnkube-node-wrxcw\" (UID: \"ac641325-3949-4503-86d7-77e5ededa110\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrxcw" Apr 16 14:52:32.363764 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.362106 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c0f8acc5-b88a-4adf-b2de-441b178001bf-hosts-file\") pod \"node-resolver-mvlp4\" (UID: \"c0f8acc5-b88a-4adf-b2de-441b178001bf\") " pod="openshift-dns/node-resolver-mvlp4" Apr 16 14:52:32.363764 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.362108 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ac641325-3949-4503-86d7-77e5ededa110-run-ovn\") pod \"ovnkube-node-wrxcw\" (UID: \"ac641325-3949-4503-86d7-77e5ededa110\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrxcw" 
Apr 16 14:52:32.363764 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.362133 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b9885cab-5589-4597-9b6e-6ef20f7bc2b2-os-release\") pod \"multus-additional-cni-plugins-dtvll\" (UID: \"b9885cab-5589-4597-9b6e-6ef20f7bc2b2\") " pod="openshift-multus/multus-additional-cni-plugins-dtvll" Apr 16 14:52:32.363764 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.362189 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/16b9981c-80e1-46aa-90b5-651d968a8850-os-release\") pod \"multus-flwwz\" (UID: \"16b9981c-80e1-46aa-90b5-651d968a8850\") " pod="openshift-multus/multus-flwwz" Apr 16 14:52:32.363764 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.362234 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f9f0740c-e453-47f4-a40d-564dd1731056-lib-modules\") pod \"tuned-96845\" (UID: \"f9f0740c-e453-47f4-a40d-564dd1731056\") " pod="openshift-cluster-node-tuning-operator/tuned-96845" Apr 16 14:52:32.364558 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.362272 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/16b9981c-80e1-46aa-90b5-651d968a8850-host-var-lib-kubelet\") pod \"multus-flwwz\" (UID: \"16b9981c-80e1-46aa-90b5-651d968a8850\") " pod="openshift-multus/multus-flwwz" Apr 16 14:52:32.364558 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.362285 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c0f8acc5-b88a-4adf-b2de-441b178001bf-hosts-file\") pod \"node-resolver-mvlp4\" (UID: \"c0f8acc5-b88a-4adf-b2de-441b178001bf\") " pod="openshift-dns/node-resolver-mvlp4" Apr 16 14:52:32.364558 ip-10-0-139-47 kubenswrapper[2579]: I0416 
14:52:32.362329 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/f9f0740c-e453-47f4-a40d-564dd1731056-etc-sysctl-d\") pod \"tuned-96845\" (UID: \"f9f0740c-e453-47f4-a40d-564dd1731056\") " pod="openshift-cluster-node-tuning-operator/tuned-96845" Apr 16 14:52:32.364558 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.362349 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ac641325-3949-4503-86d7-77e5ededa110-log-socket\") pod \"ovnkube-node-wrxcw\" (UID: \"ac641325-3949-4503-86d7-77e5ededa110\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrxcw" Apr 16 14:52:32.364558 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.362374 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b9885cab-5589-4597-9b6e-6ef20f7bc2b2-cni-binary-copy\") pod \"multus-additional-cni-plugins-dtvll\" (UID: \"b9885cab-5589-4597-9b6e-6ef20f7bc2b2\") " pod="openshift-multus/multus-additional-cni-plugins-dtvll" Apr 16 14:52:32.364558 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.362412 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f9f0740c-e453-47f4-a40d-564dd1731056-host\") pod \"tuned-96845\" (UID: \"f9f0740c-e453-47f4-a40d-564dd1731056\") " pod="openshift-cluster-node-tuning-operator/tuned-96845" Apr 16 14:52:32.364558 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.362450 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/16b9981c-80e1-46aa-90b5-651d968a8850-multus-cni-dir\") pod \"multus-flwwz\" (UID: \"16b9981c-80e1-46aa-90b5-651d968a8850\") " pod="openshift-multus/multus-flwwz" Apr 16 14:52:32.364558 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.362455 2579 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f9f0740c-e453-47f4-a40d-564dd1731056-host\") pod \"tuned-96845\" (UID: \"f9f0740c-e453-47f4-a40d-564dd1731056\") " pod="openshift-cluster-node-tuning-operator/tuned-96845" Apr 16 14:52:32.364558 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.362497 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/16b9981c-80e1-46aa-90b5-651d968a8850-host-run-multus-certs\") pod \"multus-flwwz\" (UID: \"16b9981c-80e1-46aa-90b5-651d968a8850\") " pod="openshift-multus/multus-flwwz" Apr 16 14:52:32.364558 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.362523 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ac641325-3949-4503-86d7-77e5ededa110-host-run-ovn-kubernetes\") pod \"ovnkube-node-wrxcw\" (UID: \"ac641325-3949-4503-86d7-77e5ededa110\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrxcw" Apr 16 14:52:32.364558 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.362566 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ac641325-3949-4503-86d7-77e5ededa110-host-slash\") pod \"ovnkube-node-wrxcw\" (UID: \"ac641325-3949-4503-86d7-77e5ededa110\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrxcw" Apr 16 14:52:32.364558 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.362607 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/16b9981c-80e1-46aa-90b5-651d968a8850-multus-cni-dir\") pod \"multus-flwwz\" (UID: \"16b9981c-80e1-46aa-90b5-651d968a8850\") " pod="openshift-multus/multus-flwwz" Apr 16 14:52:32.364558 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.362625 2579 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ac641325-3949-4503-86d7-77e5ededa110-node-log\") pod \"ovnkube-node-wrxcw\" (UID: \"ac641325-3949-4503-86d7-77e5ededa110\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrxcw" Apr 16 14:52:32.364558 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.362674 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/da8fe018-bd6c-490e-9197-d5ea9c881a92-serviceca\") pod \"node-ca-dl2m4\" (UID: \"da8fe018-bd6c-490e-9197-d5ea9c881a92\") " pod="openshift-image-registry/node-ca-dl2m4" Apr 16 14:52:32.364558 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.362709 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b9885cab-5589-4597-9b6e-6ef20f7bc2b2-tuning-conf-dir\") pod \"multus-additional-cni-plugins-dtvll\" (UID: \"b9885cab-5589-4597-9b6e-6ef20f7bc2b2\") " pod="openshift-multus/multus-additional-cni-plugins-dtvll" Apr 16 14:52:32.364558 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.362740 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/15789381-f4bd-4597-8239-7c3b98a87e12-host-slash\") pod \"iptables-alerter-nsdp5\" (UID: \"15789381-f4bd-4597-8239-7c3b98a87e12\") " pod="openshift-network-operator/iptables-alerter-nsdp5" Apr 16 14:52:32.364558 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.362768 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/16b9981c-80e1-46aa-90b5-651d968a8850-cnibin\") pod \"multus-flwwz\" (UID: \"16b9981c-80e1-46aa-90b5-651d968a8850\") " pod="openshift-multus/multus-flwwz" Apr 16 14:52:32.364558 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.362830 2579 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/16b9981c-80e1-46aa-90b5-651d968a8850-multus-conf-dir\") pod \"multus-flwwz\" (UID: \"16b9981c-80e1-46aa-90b5-651d968a8850\") " pod="openshift-multus/multus-flwwz" Apr 16 14:52:32.365379 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.362861 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ac641325-3949-4503-86d7-77e5ededa110-host-cni-bin\") pod \"ovnkube-node-wrxcw\" (UID: \"ac641325-3949-4503-86d7-77e5ededa110\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrxcw" Apr 16 14:52:32.365379 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.362892 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ac641325-3949-4503-86d7-77e5ededa110-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wrxcw\" (UID: \"ac641325-3949-4503-86d7-77e5ededa110\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrxcw" Apr 16 14:52:32.365379 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.362949 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ac641325-3949-4503-86d7-77e5ededa110-ovnkube-config\") pod \"ovnkube-node-wrxcw\" (UID: \"ac641325-3949-4503-86d7-77e5ededa110\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrxcw" Apr 16 14:52:32.365379 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.362972 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/aabf9cf2-dcc2-4a42-b211-e34be354e3ed-agent-certs\") pod \"konnectivity-agent-8jpq8\" (UID: \"aabf9cf2-dcc2-4a42-b211-e34be354e3ed\") " pod="kube-system/konnectivity-agent-8jpq8" Apr 16 14:52:32.365379 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.363000 
2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c0f8acc5-b88a-4adf-b2de-441b178001bf-tmp-dir\") pod \"node-resolver-mvlp4\" (UID: \"c0f8acc5-b88a-4adf-b2de-441b178001bf\") " pod="openshift-dns/node-resolver-mvlp4" Apr 16 14:52:32.365379 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.363027 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f9f0740c-e453-47f4-a40d-564dd1731056-etc-kubernetes\") pod \"tuned-96845\" (UID: \"f9f0740c-e453-47f4-a40d-564dd1731056\") " pod="openshift-cluster-node-tuning-operator/tuned-96845" Apr 16 14:52:32.365379 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.363059 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b9885cab-5589-4597-9b6e-6ef20f7bc2b2-cnibin\") pod \"multus-additional-cni-plugins-dtvll\" (UID: \"b9885cab-5589-4597-9b6e-6ef20f7bc2b2\") " pod="openshift-multus/multus-additional-cni-plugins-dtvll" Apr 16 14:52:32.365379 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.363080 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b9885cab-5589-4597-9b6e-6ef20f7bc2b2-cni-binary-copy\") pod \"multus-additional-cni-plugins-dtvll\" (UID: \"b9885cab-5589-4597-9b6e-6ef20f7bc2b2\") " pod="openshift-multus/multus-additional-cni-plugins-dtvll" Apr 16 14:52:32.365379 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.363155 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gw5qr\" (UniqueName: \"kubernetes.io/projected/15789381-f4bd-4597-8239-7c3b98a87e12-kube-api-access-gw5qr\") pod \"iptables-alerter-nsdp5\" (UID: \"15789381-f4bd-4597-8239-7c3b98a87e12\") " pod="openshift-network-operator/iptables-alerter-nsdp5" Apr 16 14:52:32.365379 ip-10-0-139-47 
kubenswrapper[2579]: I0416 14:52:32.363184 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/f9f0740c-e453-47f4-a40d-564dd1731056-etc-sysconfig\") pod \"tuned-96845\" (UID: \"f9f0740c-e453-47f4-a40d-564dd1731056\") " pod="openshift-cluster-node-tuning-operator/tuned-96845" Apr 16 14:52:32.365379 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.363205 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f9f0740c-e453-47f4-a40d-564dd1731056-sys\") pod \"tuned-96845\" (UID: \"f9f0740c-e453-47f4-a40d-564dd1731056\") " pod="openshift-cluster-node-tuning-operator/tuned-96845" Apr 16 14:52:32.365379 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.363232 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7c9231d4-b789-4d31-9ab1-0514edc9a681-kubelet-dir\") pod \"aws-ebs-csi-driver-node-jc5c4\" (UID: \"7c9231d4-b789-4d31-9ab1-0514edc9a681\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jc5c4" Apr 16 14:52:32.365379 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.363261 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vl5fm\" (UniqueName: \"kubernetes.io/projected/7c9231d4-b789-4d31-9ab1-0514edc9a681-kube-api-access-vl5fm\") pod \"aws-ebs-csi-driver-node-jc5c4\" (UID: \"7c9231d4-b789-4d31-9ab1-0514edc9a681\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jc5c4" Apr 16 14:52:32.365379 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.363345 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/16b9981c-80e1-46aa-90b5-651d968a8850-host-run-multus-certs\") pod \"multus-flwwz\" (UID: \"16b9981c-80e1-46aa-90b5-651d968a8850\") " pod="openshift-multus/multus-flwwz" Apr 
16 14:52:32.365379 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.363358 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f9f0740c-e453-47f4-a40d-564dd1731056-etc-kubernetes\") pod \"tuned-96845\" (UID: \"f9f0740c-e453-47f4-a40d-564dd1731056\") " pod="openshift-cluster-node-tuning-operator/tuned-96845" Apr 16 14:52:32.365379 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.363388 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c0f8acc5-b88a-4adf-b2de-441b178001bf-tmp-dir\") pod \"node-resolver-mvlp4\" (UID: \"c0f8acc5-b88a-4adf-b2de-441b178001bf\") " pod="openshift-dns/node-resolver-mvlp4" Apr 16 14:52:32.365379 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.363422 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/16b9981c-80e1-46aa-90b5-651d968a8850-cnibin\") pod \"multus-flwwz\" (UID: \"16b9981c-80e1-46aa-90b5-651d968a8850\") " pod="openshift-multus/multus-flwwz" Apr 16 14:52:32.366083 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.363428 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/15789381-f4bd-4597-8239-7c3b98a87e12-host-slash\") pod \"iptables-alerter-nsdp5\" (UID: \"15789381-f4bd-4597-8239-7c3b98a87e12\") " pod="openshift-network-operator/iptables-alerter-nsdp5" Apr 16 14:52:32.366083 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.363519 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b9885cab-5589-4597-9b6e-6ef20f7bc2b2-cnibin\") pod \"multus-additional-cni-plugins-dtvll\" (UID: \"b9885cab-5589-4597-9b6e-6ef20f7bc2b2\") " pod="openshift-multus/multus-additional-cni-plugins-dtvll" Apr 16 14:52:32.366083 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.363464 
2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/16b9981c-80e1-46aa-90b5-651d968a8850-etc-kubernetes\") pod \"multus-flwwz\" (UID: \"16b9981c-80e1-46aa-90b5-651d968a8850\") " pod="openshift-multus/multus-flwwz" Apr 16 14:52:32.366083 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.363826 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/f9f0740c-e453-47f4-a40d-564dd1731056-etc-sysconfig\") pod \"tuned-96845\" (UID: \"f9f0740c-e453-47f4-a40d-564dd1731056\") " pod="openshift-cluster-node-tuning-operator/tuned-96845" Apr 16 14:52:32.366083 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.363954 2579 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 16 14:52:32.366083 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.363836 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f9f0740c-e453-47f4-a40d-564dd1731056-sys\") pod \"tuned-96845\" (UID: \"f9f0740c-e453-47f4-a40d-564dd1731056\") " pod="openshift-cluster-node-tuning-operator/tuned-96845" Apr 16 14:52:32.366083 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.364049 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b9885cab-5589-4597-9b6e-6ef20f7bc2b2-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-dtvll\" (UID: \"b9885cab-5589-4597-9b6e-6ef20f7bc2b2\") " pod="openshift-multus/multus-additional-cni-plugins-dtvll" Apr 16 14:52:32.366083 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.364101 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/da8fe018-bd6c-490e-9197-d5ea9c881a92-serviceca\") pod \"node-ca-dl2m4\" (UID: \"da8fe018-bd6c-490e-9197-d5ea9c881a92\") " pod="openshift-image-registry/node-ca-dl2m4" Apr 16 14:52:32.366083 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.364117 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7c9231d4-b789-4d31-9ab1-0514edc9a681-kubelet-dir\") pod \"aws-ebs-csi-driver-node-jc5c4\" (UID: \"7c9231d4-b789-4d31-9ab1-0514edc9a681\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jc5c4" Apr 16 14:52:32.366083 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.364117 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f9f0740c-e453-47f4-a40d-564dd1731056-tmp\") pod \"tuned-96845\" (UID: \"f9f0740c-e453-47f4-a40d-564dd1731056\") " pod="openshift-cluster-node-tuning-operator/tuned-96845" Apr 16 14:52:32.366083 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.364162 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/7c9231d4-b789-4d31-9ab1-0514edc9a681-registration-dir\") pod \"aws-ebs-csi-driver-node-jc5c4\" (UID: \"7c9231d4-b789-4d31-9ab1-0514edc9a681\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jc5c4" Apr 16 14:52:32.366083 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.364186 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ba7ca911-2481-4fe7-9079-e770b1840406-metrics-certs\") pod \"network-metrics-daemon-vpmk7\" (UID: \"ba7ca911-2481-4fe7-9079-e770b1840406\") " pod="openshift-multus/network-metrics-daemon-vpmk7" Apr 16 14:52:32.366083 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.364221 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" 
(UniqueName: \"kubernetes.io/host-path/ac641325-3949-4503-86d7-77e5ededa110-node-log\") pod \"ovnkube-node-wrxcw\" (UID: \"ac641325-3949-4503-86d7-77e5ededa110\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrxcw" Apr 16 14:52:32.366083 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.364262 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ac641325-3949-4503-86d7-77e5ededa110-host-run-netns\") pod \"ovnkube-node-wrxcw\" (UID: \"ac641325-3949-4503-86d7-77e5ededa110\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrxcw" Apr 16 14:52:32.366083 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.364329 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ac641325-3949-4503-86d7-77e5ededa110-host-run-netns\") pod \"ovnkube-node-wrxcw\" (UID: \"ac641325-3949-4503-86d7-77e5ededa110\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrxcw" Apr 16 14:52:32.366083 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.364386 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ac641325-3949-4503-86d7-77e5ededa110-host-slash\") pod \"ovnkube-node-wrxcw\" (UID: \"ac641325-3949-4503-86d7-77e5ededa110\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrxcw" Apr 16 14:52:32.366083 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.364513 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/7c9231d4-b789-4d31-9ab1-0514edc9a681-registration-dir\") pod \"aws-ebs-csi-driver-node-jc5c4\" (UID: \"7c9231d4-b789-4d31-9ab1-0514edc9a681\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jc5c4" Apr 16 14:52:32.366083 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.364502 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ac641325-3949-4503-86d7-77e5ededa110-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wrxcw\" (UID: \"ac641325-3949-4503-86d7-77e5ededa110\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrxcw" Apr 16 14:52:32.366733 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:52:32.364563 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:52:32.366733 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.364564 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/16b9981c-80e1-46aa-90b5-651d968a8850-multus-conf-dir\") pod \"multus-flwwz\" (UID: \"16b9981c-80e1-46aa-90b5-651d968a8850\") " pod="openshift-multus/multus-flwwz" Apr 16 14:52:32.366733 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.364599 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ac641325-3949-4503-86d7-77e5ededa110-host-cni-bin\") pod \"ovnkube-node-wrxcw\" (UID: \"ac641325-3949-4503-86d7-77e5ededa110\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrxcw" Apr 16 14:52:32.366733 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:52:32.364634 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ba7ca911-2481-4fe7-9079-e770b1840406-metrics-certs podName:ba7ca911-2481-4fe7-9079-e770b1840406 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:32.864608235 +0000 UTC m=+2.047146010 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ba7ca911-2481-4fe7-9079-e770b1840406-metrics-certs") pod "network-metrics-daemon-vpmk7" (UID: "ba7ca911-2481-4fe7-9079-e770b1840406") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:52:32.366733 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.364673 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/16b9981c-80e1-46aa-90b5-651d968a8850-host-var-lib-cni-multus\") pod \"multus-flwwz\" (UID: \"16b9981c-80e1-46aa-90b5-651d968a8850\") " pod="openshift-multus/multus-flwwz" Apr 16 14:52:32.366733 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.364719 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/16b9981c-80e1-46aa-90b5-651d968a8850-system-cni-dir\") pod \"multus-flwwz\" (UID: \"16b9981c-80e1-46aa-90b5-651d968a8850\") " pod="openshift-multus/multus-flwwz" Apr 16 14:52:32.366733 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.364753 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tgbzj\" (UniqueName: \"kubernetes.io/projected/da8fe018-bd6c-490e-9197-d5ea9c881a92-kube-api-access-tgbzj\") pod \"node-ca-dl2m4\" (UID: \"da8fe018-bd6c-490e-9197-d5ea9c881a92\") " pod="openshift-image-registry/node-ca-dl2m4" Apr 16 14:52:32.366733 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.364808 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/7c9231d4-b789-4d31-9ab1-0514edc9a681-socket-dir\") pod \"aws-ebs-csi-driver-node-jc5c4\" (UID: \"7c9231d4-b789-4d31-9ab1-0514edc9a681\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jc5c4" Apr 16 14:52:32.366733 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.364812 2579 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/16b9981c-80e1-46aa-90b5-651d968a8850-host-var-lib-cni-multus\") pod \"multus-flwwz\" (UID: \"16b9981c-80e1-46aa-90b5-651d968a8850\") " pod="openshift-multus/multus-flwwz" Apr 16 14:52:32.366733 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.364855 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ac641325-3949-4503-86d7-77e5ededa110-run-systemd\") pod \"ovnkube-node-wrxcw\" (UID: \"ac641325-3949-4503-86d7-77e5ededa110\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrxcw" Apr 16 14:52:32.366733 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.364880 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/16b9981c-80e1-46aa-90b5-651d968a8850-system-cni-dir\") pod \"multus-flwwz\" (UID: \"16b9981c-80e1-46aa-90b5-651d968a8850\") " pod="openshift-multus/multus-flwwz" Apr 16 14:52:32.366733 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.364979 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/7c9231d4-b789-4d31-9ab1-0514edc9a681-socket-dir\") pod \"aws-ebs-csi-driver-node-jc5c4\" (UID: \"7c9231d4-b789-4d31-9ab1-0514edc9a681\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jc5c4" Apr 16 14:52:32.366733 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.365017 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ac641325-3949-4503-86d7-77e5ededa110-run-systemd\") pod \"ovnkube-node-wrxcw\" (UID: \"ac641325-3949-4503-86d7-77e5ededa110\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrxcw" Apr 16 14:52:32.366733 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.365068 2579 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ac641325-3949-4503-86d7-77e5ededa110-host-cni-netd\") pod \"ovnkube-node-wrxcw\" (UID: \"ac641325-3949-4503-86d7-77e5ededa110\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrxcw" Apr 16 14:52:32.366733 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.365145 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ac641325-3949-4503-86d7-77e5ededa110-host-cni-netd\") pod \"ovnkube-node-wrxcw\" (UID: \"ac641325-3949-4503-86d7-77e5ededa110\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrxcw" Apr 16 14:52:32.366733 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.365140 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ac641325-3949-4503-86d7-77e5ededa110-ovn-node-metrics-cert\") pod \"ovnkube-node-wrxcw\" (UID: \"ac641325-3949-4503-86d7-77e5ededa110\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrxcw" Apr 16 14:52:32.366733 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.365191 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ac641325-3949-4503-86d7-77e5ededa110-ovnkube-config\") pod \"ovnkube-node-wrxcw\" (UID: \"ac641325-3949-4503-86d7-77e5ededa110\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrxcw" Apr 16 14:52:32.367344 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.365240 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/f9f0740c-e453-47f4-a40d-564dd1731056-etc-sysctl-conf\") pod \"tuned-96845\" (UID: \"f9f0740c-e453-47f4-a40d-564dd1731056\") " pod="openshift-cluster-node-tuning-operator/tuned-96845" Apr 16 14:52:32.367344 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.365291 2579 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/f9f0740c-e453-47f4-a40d-564dd1731056-etc-systemd\") pod \"tuned-96845\" (UID: \"f9f0740c-e453-47f4-a40d-564dd1731056\") " pod="openshift-cluster-node-tuning-operator/tuned-96845" Apr 16 14:52:32.367344 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.365292 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b9885cab-5589-4597-9b6e-6ef20f7bc2b2-tuning-conf-dir\") pod \"multus-additional-cni-plugins-dtvll\" (UID: \"b9885cab-5589-4597-9b6e-6ef20f7bc2b2\") " pod="openshift-multus/multus-additional-cni-plugins-dtvll" Apr 16 14:52:32.367344 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.365345 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/f9f0740c-e453-47f4-a40d-564dd1731056-etc-tuned\") pod \"tuned-96845\" (UID: \"f9f0740c-e453-47f4-a40d-564dd1731056\") " pod="openshift-cluster-node-tuning-operator/tuned-96845" Apr 16 14:52:32.367344 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.365369 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/f9f0740c-e453-47f4-a40d-564dd1731056-etc-sysctl-conf\") pod \"tuned-96845\" (UID: \"f9f0740c-e453-47f4-a40d-564dd1731056\") " pod="openshift-cluster-node-tuning-operator/tuned-96845" Apr 16 14:52:32.367344 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.365377 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/7c9231d4-b789-4d31-9ab1-0514edc9a681-etc-selinux\") pod \"aws-ebs-csi-driver-node-jc5c4\" (UID: \"7c9231d4-b789-4d31-9ab1-0514edc9a681\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jc5c4" Apr 16 14:52:32.367344 ip-10-0-139-47 
kubenswrapper[2579]: I0416 14:52:32.365437 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/f9f0740c-e453-47f4-a40d-564dd1731056-etc-systemd\") pod \"tuned-96845\" (UID: \"f9f0740c-e453-47f4-a40d-564dd1731056\") " pod="openshift-cluster-node-tuning-operator/tuned-96845" Apr 16 14:52:32.367344 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.365482 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/16b9981c-80e1-46aa-90b5-651d968a8850-cni-binary-copy\") pod \"multus-flwwz\" (UID: \"16b9981c-80e1-46aa-90b5-651d968a8850\") " pod="openshift-multus/multus-flwwz" Apr 16 14:52:32.367344 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.365489 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/7c9231d4-b789-4d31-9ab1-0514edc9a681-etc-selinux\") pod \"aws-ebs-csi-driver-node-jc5c4\" (UID: \"7c9231d4-b789-4d31-9ab1-0514edc9a681\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jc5c4" Apr 16 14:52:32.367344 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.365557 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/16b9981c-80e1-46aa-90b5-651d968a8850-hostroot\") pod \"multus-flwwz\" (UID: \"16b9981c-80e1-46aa-90b5-651d968a8850\") " pod="openshift-multus/multus-flwwz" Apr 16 14:52:32.367344 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.365563 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b9885cab-5589-4597-9b6e-6ef20f7bc2b2-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-dtvll\" (UID: \"b9885cab-5589-4597-9b6e-6ef20f7bc2b2\") " pod="openshift-multus/multus-additional-cni-plugins-dtvll" Apr 16 14:52:32.367344 ip-10-0-139-47 
kubenswrapper[2579]: I0416 14:52:32.365597 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/16b9981c-80e1-46aa-90b5-651d968a8850-multus-daemon-config\") pod \"multus-flwwz\" (UID: \"16b9981c-80e1-46aa-90b5-651d968a8850\") " pod="openshift-multus/multus-flwwz" Apr 16 14:52:32.367344 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.365634 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ac641325-3949-4503-86d7-77e5ededa110-var-lib-openvswitch\") pod \"ovnkube-node-wrxcw\" (UID: \"ac641325-3949-4503-86d7-77e5ededa110\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrxcw" Apr 16 14:52:32.367344 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.365674 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ac641325-3949-4503-86d7-77e5ededa110-env-overrides\") pod \"ovnkube-node-wrxcw\" (UID: \"ac641325-3949-4503-86d7-77e5ededa110\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrxcw" Apr 16 14:52:32.367344 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.365624 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/16b9981c-80e1-46aa-90b5-651d968a8850-hostroot\") pod \"multus-flwwz\" (UID: \"16b9981c-80e1-46aa-90b5-651d968a8850\") " pod="openshift-multus/multus-flwwz" Apr 16 14:52:32.367344 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.365874 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ac641325-3949-4503-86d7-77e5ededa110-var-lib-openvswitch\") pod \"ovnkube-node-wrxcw\" (UID: \"ac641325-3949-4503-86d7-77e5ededa110\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrxcw" Apr 16 14:52:32.367344 ip-10-0-139-47 kubenswrapper[2579]: I0416 
14:52:32.365939 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9xj66\" (UniqueName: \"kubernetes.io/projected/b9885cab-5589-4597-9b6e-6ef20f7bc2b2-kube-api-access-9xj66\") pod \"multus-additional-cni-plugins-dtvll\" (UID: \"b9885cab-5589-4597-9b6e-6ef20f7bc2b2\") " pod="openshift-multus/multus-additional-cni-plugins-dtvll" Apr 16 14:52:32.367970 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.365978 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f9f0740c-e453-47f4-a40d-564dd1731056-run\") pod \"tuned-96845\" (UID: \"f9f0740c-e453-47f4-a40d-564dd1731056\") " pod="openshift-cluster-node-tuning-operator/tuned-96845" Apr 16 14:52:32.367970 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.366538 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f9f0740c-e453-47f4-a40d-564dd1731056-var-lib-kubelet\") pod \"tuned-96845\" (UID: \"f9f0740c-e453-47f4-a40d-564dd1731056\") " pod="openshift-cluster-node-tuning-operator/tuned-96845" Apr 16 14:52:32.367970 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.366616 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/16b9981c-80e1-46aa-90b5-651d968a8850-cni-binary-copy\") pod \"multus-flwwz\" (UID: \"16b9981c-80e1-46aa-90b5-651d968a8850\") " pod="openshift-multus/multus-flwwz" Apr 16 14:52:32.367970 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.366622 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f9f0740c-e453-47f4-a40d-564dd1731056-var-lib-kubelet\") pod \"tuned-96845\" (UID: \"f9f0740c-e453-47f4-a40d-564dd1731056\") " pod="openshift-cluster-node-tuning-operator/tuned-96845" Apr 16 14:52:32.367970 ip-10-0-139-47 kubenswrapper[2579]: I0416 
14:52:32.366474 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f9f0740c-e453-47f4-a40d-564dd1731056-run\") pod \"tuned-96845\" (UID: \"f9f0740c-e453-47f4-a40d-564dd1731056\") " pod="openshift-cluster-node-tuning-operator/tuned-96845" Apr 16 14:52:32.367970 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.366766 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ac641325-3949-4503-86d7-77e5ededa110-env-overrides\") pod \"ovnkube-node-wrxcw\" (UID: \"ac641325-3949-4503-86d7-77e5ededa110\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrxcw" Apr 16 14:52:32.367970 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.367016 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/16b9981c-80e1-46aa-90b5-651d968a8850-multus-daemon-config\") pod \"multus-flwwz\" (UID: \"16b9981c-80e1-46aa-90b5-651d968a8850\") " pod="openshift-multus/multus-flwwz" Apr 16 14:52:32.367970 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.367760 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ac641325-3949-4503-86d7-77e5ededa110-ovn-node-metrics-cert\") pod \"ovnkube-node-wrxcw\" (UID: \"ac641325-3949-4503-86d7-77e5ededa110\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrxcw" Apr 16 14:52:32.367970 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.367780 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f9f0740c-e453-47f4-a40d-564dd1731056-tmp\") pod \"tuned-96845\" (UID: \"f9f0740c-e453-47f4-a40d-564dd1731056\") " pod="openshift-cluster-node-tuning-operator/tuned-96845" Apr 16 14:52:32.367970 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.367936 2579 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/f9f0740c-e453-47f4-a40d-564dd1731056-etc-tuned\") pod \"tuned-96845\" (UID: \"f9f0740c-e453-47f4-a40d-564dd1731056\") " pod="openshift-cluster-node-tuning-operator/tuned-96845" Apr 16 14:52:32.368292 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.368025 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/aabf9cf2-dcc2-4a42-b211-e34be354e3ed-agent-certs\") pod \"konnectivity-agent-8jpq8\" (UID: \"aabf9cf2-dcc2-4a42-b211-e34be354e3ed\") " pod="kube-system/konnectivity-agent-8jpq8" Apr 16 14:52:32.369494 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.369469 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwkxq\" (UniqueName: \"kubernetes.io/projected/c0f8acc5-b88a-4adf-b2de-441b178001bf-kube-api-access-xwkxq\") pod \"node-resolver-mvlp4\" (UID: \"c0f8acc5-b88a-4adf-b2de-441b178001bf\") " pod="openshift-dns/node-resolver-mvlp4" Apr 16 14:52:32.369751 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.369733 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8797\" (UniqueName: \"kubernetes.io/projected/16b9981c-80e1-46aa-90b5-651d968a8850-kube-api-access-b8797\") pod \"multus-flwwz\" (UID: \"16b9981c-80e1-46aa-90b5-651d968a8850\") " pod="openshift-multus/multus-flwwz" Apr 16 14:52:32.370155 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.370138 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzxgj\" (UniqueName: \"kubernetes.io/projected/f9f0740c-e453-47f4-a40d-564dd1731056-kube-api-access-kzxgj\") pod \"tuned-96845\" (UID: \"f9f0740c-e453-47f4-a40d-564dd1731056\") " pod="openshift-cluster-node-tuning-operator/tuned-96845" Apr 16 14:52:32.371115 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.371097 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-5jgjx\" (UniqueName: \"kubernetes.io/projected/ba7ca911-2481-4fe7-9079-e770b1840406-kube-api-access-5jgjx\") pod \"network-metrics-daemon-vpmk7\" (UID: \"ba7ca911-2481-4fe7-9079-e770b1840406\") " pod="openshift-multus/network-metrics-daemon-vpmk7" Apr 16 14:52:32.371627 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.371610 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gw5qr\" (UniqueName: \"kubernetes.io/projected/15789381-f4bd-4597-8239-7c3b98a87e12-kube-api-access-gw5qr\") pod \"iptables-alerter-nsdp5\" (UID: \"15789381-f4bd-4597-8239-7c3b98a87e12\") " pod="openshift-network-operator/iptables-alerter-nsdp5" Apr 16 14:52:32.374072 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.374054 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5qfs\" (UniqueName: \"kubernetes.io/projected/ac641325-3949-4503-86d7-77e5ededa110-kube-api-access-p5qfs\") pod \"ovnkube-node-wrxcw\" (UID: \"ac641325-3949-4503-86d7-77e5ededa110\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrxcw" Apr 16 14:52:32.374443 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:52:32.374429 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 14:52:32.374488 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:52:32.374473 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 14:52:32.374488 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:52:32.374483 2579 projected.go:194] Error preparing data for projected volume kube-api-access-crm5b for pod openshift-network-diagnostics/network-check-target-qh75b: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 
16 14:52:32.374557 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:52:32.374536 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1609e3b0-3601-4e3b-bb3c-b89f06a20319-kube-api-access-crm5b podName:1609e3b0-3601-4e3b-bb3c-b89f06a20319 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:32.87452273 +0000 UTC m=+2.057060482 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-crm5b" (UniqueName: "kubernetes.io/projected/1609e3b0-3601-4e3b-bb3c-b89f06a20319-kube-api-access-crm5b") pod "network-check-target-qh75b" (UID: "1609e3b0-3601-4e3b-bb3c-b89f06a20319") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:52:32.378357 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.378329 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgbzj\" (UniqueName: \"kubernetes.io/projected/da8fe018-bd6c-490e-9197-d5ea9c881a92-kube-api-access-tgbzj\") pod \"node-ca-dl2m4\" (UID: \"da8fe018-bd6c-490e-9197-d5ea9c881a92\") " pod="openshift-image-registry/node-ca-dl2m4" Apr 16 14:52:32.378491 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.378474 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xj66\" (UniqueName: \"kubernetes.io/projected/b9885cab-5589-4597-9b6e-6ef20f7bc2b2-kube-api-access-9xj66\") pod \"multus-additional-cni-plugins-dtvll\" (UID: \"b9885cab-5589-4597-9b6e-6ef20f7bc2b2\") " pod="openshift-multus/multus-additional-cni-plugins-dtvll" Apr 16 14:52:32.379431 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.379412 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vl5fm\" (UniqueName: \"kubernetes.io/projected/7c9231d4-b789-4d31-9ab1-0514edc9a681-kube-api-access-vl5fm\") pod \"aws-ebs-csi-driver-node-jc5c4\" (UID: 
\"7c9231d4-b789-4d31-9ab1-0514edc9a681\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jc5c4" Apr 16 14:52:32.573808 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.573718 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-nsdp5" Apr 16 14:52:32.580215 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:32.580189 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod15789381_f4bd_4597_8239_7c3b98a87e12.slice/crio-36cc3fe507a7356860e7184266f8c622b1b44e5d0991f7f87df87cc68fc04e8a WatchSource:0}: Error finding container 36cc3fe507a7356860e7184266f8c622b1b44e5d0991f7f87df87cc68fc04e8a: Status 404 returned error can't find the container with id 36cc3fe507a7356860e7184266f8c622b1b44e5d0991f7f87df87cc68fc04e8a Apr 16 14:52:32.587867 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.587846 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-mvlp4" Apr 16 14:52:32.593595 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.593576 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-dtvll" Apr 16 14:52:32.593694 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:32.593669 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc0f8acc5_b88a_4adf_b2de_441b178001bf.slice/crio-28fa8be4752afae5df57fce632f3345a416be9a69d7271ce769a1fb55b366f73 WatchSource:0}: Error finding container 28fa8be4752afae5df57fce632f3345a416be9a69d7271ce769a1fb55b366f73: Status 404 returned error can't find the container with id 28fa8be4752afae5df57fce632f3345a416be9a69d7271ce769a1fb55b366f73 Apr 16 14:52:32.600298 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:32.600275 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb9885cab_5589_4597_9b6e_6ef20f7bc2b2.slice/crio-2d778a27c32949ad45dca886ecc40bafa2c690f10ff7d234ab1351aad6063db4 WatchSource:0}: Error finding container 2d778a27c32949ad45dca886ecc40bafa2c690f10ff7d234ab1351aad6063db4: Status 404 returned error can't find the container with id 2d778a27c32949ad45dca886ecc40bafa2c690f10ff7d234ab1351aad6063db4 Apr 16 14:52:32.616483 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.616465 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-flwwz" Apr 16 14:52:32.621784 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:32.621757 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16b9981c_80e1_46aa_90b5_651d968a8850.slice/crio-6ee646effa2710b0de63960a72b68a2399304286d83993d4096b9feef475aa8a WatchSource:0}: Error finding container 6ee646effa2710b0de63960a72b68a2399304286d83993d4096b9feef475aa8a: Status 404 returned error can't find the container with id 6ee646effa2710b0de63960a72b68a2399304286d83993d4096b9feef475aa8a Apr 16 14:52:32.633968 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.633951 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-dl2m4" Apr 16 14:52:32.640117 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:32.640095 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda8fe018_bd6c_490e_9197_d5ea9c881a92.slice/crio-54aedc8b2c1b3946e67756554a24f134a5e9a477e24be50e88b472f407c3ab77 WatchSource:0}: Error finding container 54aedc8b2c1b3946e67756554a24f134a5e9a477e24be50e88b472f407c3ab77: Status 404 returned error can't find the container with id 54aedc8b2c1b3946e67756554a24f134a5e9a477e24be50e88b472f407c3ab77 Apr 16 14:52:32.642836 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.642811 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-96845" Apr 16 14:52:32.648350 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:32.648324 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9f0740c_e453_47f4_a40d_564dd1731056.slice/crio-ab4bafad927186c08c79a268194eca59f26d5fc646b21cb031f69d37b5a19642 WatchSource:0}: Error finding container ab4bafad927186c08c79a268194eca59f26d5fc646b21cb031f69d37b5a19642: Status 404 returned error can't find the container with id ab4bafad927186c08c79a268194eca59f26d5fc646b21cb031f69d37b5a19642 Apr 16 14:52:32.648980 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.648962 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-wrxcw" Apr 16 14:52:32.654493 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:32.654468 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac641325_3949_4503_86d7_77e5ededa110.slice/crio-163f056c4053fdb23d5db1381d3d01e6ccdd92efd4eff4b626bcc97319cb8ff9 WatchSource:0}: Error finding container 163f056c4053fdb23d5db1381d3d01e6ccdd92efd4eff4b626bcc97319cb8ff9: Status 404 returned error can't find the container with id 163f056c4053fdb23d5db1381d3d01e6ccdd92efd4eff4b626bcc97319cb8ff9 Apr 16 14:52:32.655159 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.655143 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-8jpq8" Apr 16 14:52:32.660360 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.660340 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jc5c4" Apr 16 14:52:32.662865 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:32.662845 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaabf9cf2_dcc2_4a42_b211_e34be354e3ed.slice/crio-ac4c3d79804e3f97c9c6898f29ebc101da99287a1b88affe124da5e888a74d4f WatchSource:0}: Error finding container ac4c3d79804e3f97c9c6898f29ebc101da99287a1b88affe124da5e888a74d4f: Status 404 returned error can't find the container with id ac4c3d79804e3f97c9c6898f29ebc101da99287a1b88affe124da5e888a74d4f Apr 16 14:52:32.667265 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:52:32.667246 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c9231d4_b789_4d31_9ab1_0514edc9a681.slice/crio-81bd1ead99e51daa92049f7e03a47e4b511fd4ee03baafaa46ee3eac64a94932 WatchSource:0}: Error finding container 81bd1ead99e51daa92049f7e03a47e4b511fd4ee03baafaa46ee3eac64a94932: Status 404 returned error can't find the container with id 81bd1ead99e51daa92049f7e03a47e4b511fd4ee03baafaa46ee3eac64a94932 Apr 16 14:52:32.720512 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.720479 2579 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 14:52:32.870470 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.869894 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ba7ca911-2481-4fe7-9079-e770b1840406-metrics-certs\") pod \"network-metrics-daemon-vpmk7\" (UID: \"ba7ca911-2481-4fe7-9079-e770b1840406\") " pod="openshift-multus/network-metrics-daemon-vpmk7" Apr 16 14:52:32.870470 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:52:32.870053 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object 
"openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:52:32.870470 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:52:32.870118 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ba7ca911-2481-4fe7-9079-e770b1840406-metrics-certs podName:ba7ca911-2481-4fe7-9079-e770b1840406 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:33.870099947 +0000 UTC m=+3.052637722 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ba7ca911-2481-4fe7-9079-e770b1840406-metrics-certs") pod "network-metrics-daemon-vpmk7" (UID: "ba7ca911-2481-4fe7-9079-e770b1840406") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:52:32.970887 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:32.970829 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-crm5b\" (UniqueName: \"kubernetes.io/projected/1609e3b0-3601-4e3b-bb3c-b89f06a20319-kube-api-access-crm5b\") pod \"network-check-target-qh75b\" (UID: \"1609e3b0-3601-4e3b-bb3c-b89f06a20319\") " pod="openshift-network-diagnostics/network-check-target-qh75b" Apr 16 14:52:32.971079 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:52:32.971056 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 14:52:32.971144 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:52:32.971087 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 14:52:32.971144 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:52:32.971111 2579 projected.go:194] Error preparing data for projected volume kube-api-access-crm5b for pod openshift-network-diagnostics/network-check-target-qh75b: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not 
registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:52:32.971244 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:52:32.971176 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1609e3b0-3601-4e3b-bb3c-b89f06a20319-kube-api-access-crm5b podName:1609e3b0-3601-4e3b-bb3c-b89f06a20319 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:33.971156141 +0000 UTC m=+3.153693912 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-crm5b" (UniqueName: "kubernetes.io/projected/1609e3b0-3601-4e3b-bb3c-b89f06a20319-kube-api-access-crm5b") pod "network-check-target-qh75b" (UID: "1609e3b0-3601-4e3b-bb3c-b89f06a20319") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:52:33.298318 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:33.298222 2579 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 14:47:32 +0000 UTC" deadline="2027-12-26 08:32:40.812333212 +0000 UTC" Apr 16 14:52:33.298318 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:33.298258 2579 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14849h40m7.514079936s" Apr 16 14:52:33.381281 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:33.381242 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-nsdp5" event={"ID":"15789381-f4bd-4597-8239-7c3b98a87e12","Type":"ContainerStarted","Data":"36cc3fe507a7356860e7184266f8c622b1b44e5d0991f7f87df87cc68fc04e8a"} Apr 16 14:52:33.394817 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:33.394754 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-8jpq8" 
event={"ID":"aabf9cf2-dcc2-4a42-b211-e34be354e3ed","Type":"ContainerStarted","Data":"ac4c3d79804e3f97c9c6898f29ebc101da99287a1b88affe124da5e888a74d4f"} Apr 16 14:52:33.398169 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:33.398052 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-96845" event={"ID":"f9f0740c-e453-47f4-a40d-564dd1731056","Type":"ContainerStarted","Data":"ab4bafad927186c08c79a268194eca59f26d5fc646b21cb031f69d37b5a19642"} Apr 16 14:52:33.426816 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:33.426741 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-flwwz" event={"ID":"16b9981c-80e1-46aa-90b5-651d968a8850","Type":"ContainerStarted","Data":"6ee646effa2710b0de63960a72b68a2399304286d83993d4096b9feef475aa8a"} Apr 16 14:52:33.429100 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:33.429071 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dtvll" event={"ID":"b9885cab-5589-4597-9b6e-6ef20f7bc2b2","Type":"ContainerStarted","Data":"2d778a27c32949ad45dca886ecc40bafa2c690f10ff7d234ab1351aad6063db4"} Apr 16 14:52:33.454865 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:33.454810 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jc5c4" event={"ID":"7c9231d4-b789-4d31-9ab1-0514edc9a681","Type":"ContainerStarted","Data":"81bd1ead99e51daa92049f7e03a47e4b511fd4ee03baafaa46ee3eac64a94932"} Apr 16 14:52:33.471081 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:33.471002 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wrxcw" event={"ID":"ac641325-3949-4503-86d7-77e5ededa110","Type":"ContainerStarted","Data":"163f056c4053fdb23d5db1381d3d01e6ccdd92efd4eff4b626bcc97319cb8ff9"} Apr 16 14:52:33.482856 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:33.482791 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-image-registry/node-ca-dl2m4" event={"ID":"da8fe018-bd6c-490e-9197-d5ea9c881a92","Type":"ContainerStarted","Data":"54aedc8b2c1b3946e67756554a24f134a5e9a477e24be50e88b472f407c3ab77"} Apr 16 14:52:33.492382 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:33.492331 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-mvlp4" event={"ID":"c0f8acc5-b88a-4adf-b2de-441b178001bf","Type":"ContainerStarted","Data":"28fa8be4752afae5df57fce632f3345a416be9a69d7271ce769a1fb55b366f73"} Apr 16 14:52:33.542507 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:33.542471 2579 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 14:52:33.886682 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:33.886639 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ba7ca911-2481-4fe7-9079-e770b1840406-metrics-certs\") pod \"network-metrics-daemon-vpmk7\" (UID: \"ba7ca911-2481-4fe7-9079-e770b1840406\") " pod="openshift-multus/network-metrics-daemon-vpmk7" Apr 16 14:52:33.886859 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:52:33.886841 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:52:33.887021 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:52:33.886923 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ba7ca911-2481-4fe7-9079-e770b1840406-metrics-certs podName:ba7ca911-2481-4fe7-9079-e770b1840406 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:35.886889311 +0000 UTC m=+5.069427068 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ba7ca911-2481-4fe7-9079-e770b1840406-metrics-certs") pod "network-metrics-daemon-vpmk7" (UID: "ba7ca911-2481-4fe7-9079-e770b1840406") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:52:33.988278 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:33.987628 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-crm5b\" (UniqueName: \"kubernetes.io/projected/1609e3b0-3601-4e3b-bb3c-b89f06a20319-kube-api-access-crm5b\") pod \"network-check-target-qh75b\" (UID: \"1609e3b0-3601-4e3b-bb3c-b89f06a20319\") " pod="openshift-network-diagnostics/network-check-target-qh75b" Apr 16 14:52:33.988278 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:52:33.987793 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 14:52:33.988278 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:52:33.987811 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 14:52:33.988278 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:52:33.987825 2579 projected.go:194] Error preparing data for projected volume kube-api-access-crm5b for pod openshift-network-diagnostics/network-check-target-qh75b: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:52:33.988278 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:52:33.987883 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1609e3b0-3601-4e3b-bb3c-b89f06a20319-kube-api-access-crm5b podName:1609e3b0-3601-4e3b-bb3c-b89f06a20319 nodeName:}" failed. 
No retries permitted until 2026-04-16 14:52:35.987864499 +0000 UTC m=+5.170402252 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-crm5b" (UniqueName: "kubernetes.io/projected/1609e3b0-3601-4e3b-bb3c-b89f06a20319-kube-api-access-crm5b") pod "network-check-target-qh75b" (UID: "1609e3b0-3601-4e3b-bb3c-b89f06a20319") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:52:34.299107 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:34.299011 2579 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 14:47:32 +0000 UTC" deadline="2028-02-02 02:30:25.790070076 +0000 UTC" Apr 16 14:52:34.299107 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:34.299055 2579 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15755h37m51.491018565s" Apr 16 14:52:34.350564 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:34.350531 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vpmk7" Apr 16 14:52:34.350725 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:52:34.350677 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vpmk7" podUID="ba7ca911-2481-4fe7-9079-e770b1840406" Apr 16 14:52:34.351120 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:34.351101 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qh75b" Apr 16 14:52:34.351209 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:52:34.351190 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qh75b" podUID="1609e3b0-3601-4e3b-bb3c-b89f06a20319" Apr 16 14:52:35.907487 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:35.907440 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ba7ca911-2481-4fe7-9079-e770b1840406-metrics-certs\") pod \"network-metrics-daemon-vpmk7\" (UID: \"ba7ca911-2481-4fe7-9079-e770b1840406\") " pod="openshift-multus/network-metrics-daemon-vpmk7" Apr 16 14:52:35.908080 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:52:35.907593 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:52:35.908080 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:52:35.907667 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ba7ca911-2481-4fe7-9079-e770b1840406-metrics-certs podName:ba7ca911-2481-4fe7-9079-e770b1840406 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:39.907646516 +0000 UTC m=+9.090184271 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ba7ca911-2481-4fe7-9079-e770b1840406-metrics-certs") pod "network-metrics-daemon-vpmk7" (UID: "ba7ca911-2481-4fe7-9079-e770b1840406") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:52:36.008154 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:36.008092 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-crm5b\" (UniqueName: \"kubernetes.io/projected/1609e3b0-3601-4e3b-bb3c-b89f06a20319-kube-api-access-crm5b\") pod \"network-check-target-qh75b\" (UID: \"1609e3b0-3601-4e3b-bb3c-b89f06a20319\") " pod="openshift-network-diagnostics/network-check-target-qh75b" Apr 16 14:52:36.008355 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:52:36.008242 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 14:52:36.008355 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:52:36.008274 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 14:52:36.008355 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:52:36.008288 2579 projected.go:194] Error preparing data for projected volume kube-api-access-crm5b for pod openshift-network-diagnostics/network-check-target-qh75b: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:52:36.008526 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:52:36.008358 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1609e3b0-3601-4e3b-bb3c-b89f06a20319-kube-api-access-crm5b podName:1609e3b0-3601-4e3b-bb3c-b89f06a20319 nodeName:}" failed. 
No retries permitted until 2026-04-16 14:52:40.008337589 +0000 UTC m=+9.190875344 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-crm5b" (UniqueName: "kubernetes.io/projected/1609e3b0-3601-4e3b-bb3c-b89f06a20319-kube-api-access-crm5b") pod "network-check-target-qh75b" (UID: "1609e3b0-3601-4e3b-bb3c-b89f06a20319") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:52:36.350621 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:36.350473 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qh75b" Apr 16 14:52:36.350761 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:52:36.350617 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qh75b" podUID="1609e3b0-3601-4e3b-bb3c-b89f06a20319" Apr 16 14:52:36.350826 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:36.350765 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vpmk7" Apr 16 14:52:36.350946 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:52:36.350891 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vpmk7" podUID="ba7ca911-2481-4fe7-9079-e770b1840406" Apr 16 14:52:38.350999 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:38.350724 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qh75b" Apr 16 14:52:38.350999 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:38.350841 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vpmk7" Apr 16 14:52:38.350999 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:52:38.350849 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qh75b" podUID="1609e3b0-3601-4e3b-bb3c-b89f06a20319" Apr 16 14:52:38.350999 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:52:38.350946 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vpmk7" podUID="ba7ca911-2481-4fe7-9079-e770b1840406" Apr 16 14:52:39.943262 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:39.943223 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ba7ca911-2481-4fe7-9079-e770b1840406-metrics-certs\") pod \"network-metrics-daemon-vpmk7\" (UID: \"ba7ca911-2481-4fe7-9079-e770b1840406\") " pod="openshift-multus/network-metrics-daemon-vpmk7" Apr 16 14:52:39.943720 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:52:39.943368 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:52:39.943720 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:52:39.943442 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ba7ca911-2481-4fe7-9079-e770b1840406-metrics-certs podName:ba7ca911-2481-4fe7-9079-e770b1840406 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:47.94342222 +0000 UTC m=+17.125959982 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ba7ca911-2481-4fe7-9079-e770b1840406-metrics-certs") pod "network-metrics-daemon-vpmk7" (UID: "ba7ca911-2481-4fe7-9079-e770b1840406") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:52:40.044335 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:40.044296 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-crm5b\" (UniqueName: \"kubernetes.io/projected/1609e3b0-3601-4e3b-bb3c-b89f06a20319-kube-api-access-crm5b\") pod \"network-check-target-qh75b\" (UID: \"1609e3b0-3601-4e3b-bb3c-b89f06a20319\") " pod="openshift-network-diagnostics/network-check-target-qh75b" Apr 16 14:52:40.044509 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:52:40.044453 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 14:52:40.044509 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:52:40.044471 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 14:52:40.044509 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:52:40.044484 2579 projected.go:194] Error preparing data for projected volume kube-api-access-crm5b for pod openshift-network-diagnostics/network-check-target-qh75b: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:52:40.044677 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:52:40.044542 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1609e3b0-3601-4e3b-bb3c-b89f06a20319-kube-api-access-crm5b podName:1609e3b0-3601-4e3b-bb3c-b89f06a20319 nodeName:}" failed. 
No retries permitted until 2026-04-16 14:52:48.044523071 +0000 UTC m=+17.227060827 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-crm5b" (UniqueName: "kubernetes.io/projected/1609e3b0-3601-4e3b-bb3c-b89f06a20319-kube-api-access-crm5b") pod "network-check-target-qh75b" (UID: "1609e3b0-3601-4e3b-bb3c-b89f06a20319") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:52:40.350517 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:40.350431 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qh75b" Apr 16 14:52:40.350517 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:40.350467 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vpmk7" Apr 16 14:52:40.350739 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:52:40.350567 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qh75b" podUID="1609e3b0-3601-4e3b-bb3c-b89f06a20319" Apr 16 14:52:40.350739 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:52:40.350675 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vpmk7" podUID="ba7ca911-2481-4fe7-9079-e770b1840406" Apr 16 14:52:42.156049 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:42.156012 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-m9n7j"] Apr 16 14:52:42.160421 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:42.160398 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-m9n7j" Apr 16 14:52:42.160548 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:52:42.160480 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-m9n7j" podUID="aee05764-3834-4ece-aff3-6c244c99378a" Apr 16 14:52:42.259739 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:42.259702 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/aee05764-3834-4ece-aff3-6c244c99378a-dbus\") pod \"global-pull-secret-syncer-m9n7j\" (UID: \"aee05764-3834-4ece-aff3-6c244c99378a\") " pod="kube-system/global-pull-secret-syncer-m9n7j" Apr 16 14:52:42.259895 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:42.259759 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/aee05764-3834-4ece-aff3-6c244c99378a-kubelet-config\") pod \"global-pull-secret-syncer-m9n7j\" (UID: \"aee05764-3834-4ece-aff3-6c244c99378a\") " pod="kube-system/global-pull-secret-syncer-m9n7j" Apr 16 14:52:42.259895 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:42.259784 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/aee05764-3834-4ece-aff3-6c244c99378a-original-pull-secret\") pod \"global-pull-secret-syncer-m9n7j\" (UID: \"aee05764-3834-4ece-aff3-6c244c99378a\") " pod="kube-system/global-pull-secret-syncer-m9n7j" Apr 16 14:52:42.349893 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:42.349844 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qh75b" Apr 16 14:52:42.350077 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:42.349849 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vpmk7" Apr 16 14:52:42.350077 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:52:42.349987 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qh75b" podUID="1609e3b0-3601-4e3b-bb3c-b89f06a20319" Apr 16 14:52:42.350077 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:52:42.350047 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vpmk7" podUID="ba7ca911-2481-4fe7-9079-e770b1840406" Apr 16 14:52:42.360048 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:42.360026 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/aee05764-3834-4ece-aff3-6c244c99378a-kubelet-config\") pod \"global-pull-secret-syncer-m9n7j\" (UID: \"aee05764-3834-4ece-aff3-6c244c99378a\") " pod="kube-system/global-pull-secret-syncer-m9n7j" Apr 16 14:52:42.360048 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:42.360053 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/aee05764-3834-4ece-aff3-6c244c99378a-original-pull-secret\") pod \"global-pull-secret-syncer-m9n7j\" (UID: \"aee05764-3834-4ece-aff3-6c244c99378a\") " pod="kube-system/global-pull-secret-syncer-m9n7j" Apr 16 14:52:42.360196 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:42.360094 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/aee05764-3834-4ece-aff3-6c244c99378a-dbus\") pod \"global-pull-secret-syncer-m9n7j\" (UID: \"aee05764-3834-4ece-aff3-6c244c99378a\") " pod="kube-system/global-pull-secret-syncer-m9n7j" Apr 16 14:52:42.360196 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:42.360166 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/aee05764-3834-4ece-aff3-6c244c99378a-kubelet-config\") pod \"global-pull-secret-syncer-m9n7j\" (UID: \"aee05764-3834-4ece-aff3-6c244c99378a\") " pod="kube-system/global-pull-secret-syncer-m9n7j" Apr 16 14:52:42.360270 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:52:42.360192 2579 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 14:52:42.360270 ip-10-0-139-47 
kubenswrapper[2579]: E0416 14:52:42.360255 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aee05764-3834-4ece-aff3-6c244c99378a-original-pull-secret podName:aee05764-3834-4ece-aff3-6c244c99378a nodeName:}" failed. No retries permitted until 2026-04-16 14:52:42.860235918 +0000 UTC m=+12.042773684 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/aee05764-3834-4ece-aff3-6c244c99378a-original-pull-secret") pod "global-pull-secret-syncer-m9n7j" (UID: "aee05764-3834-4ece-aff3-6c244c99378a") : object "kube-system"/"original-pull-secret" not registered Apr 16 14:52:42.360270 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:42.360263 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/aee05764-3834-4ece-aff3-6c244c99378a-dbus\") pod \"global-pull-secret-syncer-m9n7j\" (UID: \"aee05764-3834-4ece-aff3-6c244c99378a\") " pod="kube-system/global-pull-secret-syncer-m9n7j" Apr 16 14:52:42.863318 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:42.863239 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/aee05764-3834-4ece-aff3-6c244c99378a-original-pull-secret\") pod \"global-pull-secret-syncer-m9n7j\" (UID: \"aee05764-3834-4ece-aff3-6c244c99378a\") " pod="kube-system/global-pull-secret-syncer-m9n7j" Apr 16 14:52:42.863453 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:52:42.863361 2579 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 14:52:42.863453 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:52:42.863418 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aee05764-3834-4ece-aff3-6c244c99378a-original-pull-secret podName:aee05764-3834-4ece-aff3-6c244c99378a nodeName:}" failed. 
No retries permitted until 2026-04-16 14:52:43.863400582 +0000 UTC m=+13.045938346 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/aee05764-3834-4ece-aff3-6c244c99378a-original-pull-secret") pod "global-pull-secret-syncer-m9n7j" (UID: "aee05764-3834-4ece-aff3-6c244c99378a") : object "kube-system"/"original-pull-secret" not registered Apr 16 14:52:43.869109 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:43.869075 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/aee05764-3834-4ece-aff3-6c244c99378a-original-pull-secret\") pod \"global-pull-secret-syncer-m9n7j\" (UID: \"aee05764-3834-4ece-aff3-6c244c99378a\") " pod="kube-system/global-pull-secret-syncer-m9n7j" Apr 16 14:52:43.869527 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:52:43.869213 2579 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 14:52:43.869527 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:52:43.869287 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aee05764-3834-4ece-aff3-6c244c99378a-original-pull-secret podName:aee05764-3834-4ece-aff3-6c244c99378a nodeName:}" failed. No retries permitted until 2026-04-16 14:52:45.869272057 +0000 UTC m=+15.051809808 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/aee05764-3834-4ece-aff3-6c244c99378a-original-pull-secret") pod "global-pull-secret-syncer-m9n7j" (UID: "aee05764-3834-4ece-aff3-6c244c99378a") : object "kube-system"/"original-pull-secret" not registered Apr 16 14:52:44.350251 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:44.349991 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qh75b" Apr 16 14:52:44.350396 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:44.349993 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vpmk7" Apr 16 14:52:44.350396 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:52:44.350304 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qh75b" podUID="1609e3b0-3601-4e3b-bb3c-b89f06a20319" Apr 16 14:52:44.350507 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:52:44.350392 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vpmk7" podUID="ba7ca911-2481-4fe7-9079-e770b1840406" Apr 16 14:52:44.350507 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:44.349996 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-m9n7j" Apr 16 14:52:44.350507 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:52:44.350490 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-m9n7j" podUID="aee05764-3834-4ece-aff3-6c244c99378a" Apr 16 14:52:45.884245 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:45.884204 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/aee05764-3834-4ece-aff3-6c244c99378a-original-pull-secret\") pod \"global-pull-secret-syncer-m9n7j\" (UID: \"aee05764-3834-4ece-aff3-6c244c99378a\") " pod="kube-system/global-pull-secret-syncer-m9n7j" Apr 16 14:52:45.884593 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:52:45.884344 2579 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 14:52:45.884593 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:52:45.884427 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aee05764-3834-4ece-aff3-6c244c99378a-original-pull-secret podName:aee05764-3834-4ece-aff3-6c244c99378a nodeName:}" failed. No retries permitted until 2026-04-16 14:52:49.884412374 +0000 UTC m=+19.066950125 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/aee05764-3834-4ece-aff3-6c244c99378a-original-pull-secret") pod "global-pull-secret-syncer-m9n7j" (UID: "aee05764-3834-4ece-aff3-6c244c99378a") : object "kube-system"/"original-pull-secret" not registered Apr 16 14:52:46.350705 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:46.350662 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qh75b" Apr 16 14:52:46.350889 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:46.350663 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vpmk7" Apr 16 14:52:46.350889 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:52:46.350763 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qh75b" podUID="1609e3b0-3601-4e3b-bb3c-b89f06a20319" Apr 16 14:52:46.350889 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:52:46.350870 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vpmk7" podUID="ba7ca911-2481-4fe7-9079-e770b1840406" Apr 16 14:52:46.351067 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:46.350918 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-m9n7j" Apr 16 14:52:46.351067 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:52:46.350991 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-m9n7j" podUID="aee05764-3834-4ece-aff3-6c244c99378a" Apr 16 14:52:47.999343 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:47.999300 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ba7ca911-2481-4fe7-9079-e770b1840406-metrics-certs\") pod \"network-metrics-daemon-vpmk7\" (UID: \"ba7ca911-2481-4fe7-9079-e770b1840406\") " pod="openshift-multus/network-metrics-daemon-vpmk7" Apr 16 14:52:47.999800 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:52:47.999487 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:52:47.999800 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:52:47.999568 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ba7ca911-2481-4fe7-9079-e770b1840406-metrics-certs podName:ba7ca911-2481-4fe7-9079-e770b1840406 nodeName:}" failed. No retries permitted until 2026-04-16 14:53:03.999547932 +0000 UTC m=+33.182085690 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ba7ca911-2481-4fe7-9079-e770b1840406-metrics-certs") pod "network-metrics-daemon-vpmk7" (UID: "ba7ca911-2481-4fe7-9079-e770b1840406") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:52:48.100792 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:48.100744 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-crm5b\" (UniqueName: \"kubernetes.io/projected/1609e3b0-3601-4e3b-bb3c-b89f06a20319-kube-api-access-crm5b\") pod \"network-check-target-qh75b\" (UID: \"1609e3b0-3601-4e3b-bb3c-b89f06a20319\") " pod="openshift-network-diagnostics/network-check-target-qh75b" Apr 16 14:52:48.100990 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:52:48.100953 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 14:52:48.100990 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:52:48.100979 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 14:52:48.101089 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:52:48.100993 2579 projected.go:194] Error preparing data for projected volume kube-api-access-crm5b for pod openshift-network-diagnostics/network-check-target-qh75b: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:52:48.101089 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:52:48.101057 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1609e3b0-3601-4e3b-bb3c-b89f06a20319-kube-api-access-crm5b podName:1609e3b0-3601-4e3b-bb3c-b89f06a20319 nodeName:}" failed. 
No retries permitted until 2026-04-16 14:53:04.101037756 +0000 UTC m=+33.283575524 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-crm5b" (UniqueName: "kubernetes.io/projected/1609e3b0-3601-4e3b-bb3c-b89f06a20319-kube-api-access-crm5b") pod "network-check-target-qh75b" (UID: "1609e3b0-3601-4e3b-bb3c-b89f06a20319") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:52:48.350622 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:48.350543 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qh75b" Apr 16 14:52:48.350815 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:48.350543 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-m9n7j" Apr 16 14:52:48.350815 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:52:48.350659 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qh75b" podUID="1609e3b0-3601-4e3b-bb3c-b89f06a20319" Apr 16 14:52:48.350815 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:48.350670 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vpmk7" Apr 16 14:52:48.350815 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:52:48.350730 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="kube-system/global-pull-secret-syncer-m9n7j" podUID="aee05764-3834-4ece-aff3-6c244c99378a" Apr 16 14:52:48.351029 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:52:48.350872 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vpmk7" podUID="ba7ca911-2481-4fe7-9079-e770b1840406" Apr 16 14:52:49.912114 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:49.912072 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/aee05764-3834-4ece-aff3-6c244c99378a-original-pull-secret\") pod \"global-pull-secret-syncer-m9n7j\" (UID: \"aee05764-3834-4ece-aff3-6c244c99378a\") " pod="kube-system/global-pull-secret-syncer-m9n7j" Apr 16 14:52:49.912647 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:52:49.912214 2579 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 14:52:49.912647 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:52:49.912300 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aee05764-3834-4ece-aff3-6c244c99378a-original-pull-secret podName:aee05764-3834-4ece-aff3-6c244c99378a nodeName:}" failed. No retries permitted until 2026-04-16 14:52:57.912278192 +0000 UTC m=+27.094815947 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/aee05764-3834-4ece-aff3-6c244c99378a-original-pull-secret") pod "global-pull-secret-syncer-m9n7j" (UID: "aee05764-3834-4ece-aff3-6c244c99378a") : object "kube-system"/"original-pull-secret" not registered Apr 16 14:52:50.350272 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:50.350247 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qh75b" Apr 16 14:52:50.350396 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:50.350293 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-m9n7j" Apr 16 14:52:50.350396 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:50.350293 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vpmk7" Apr 16 14:52:50.350396 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:52:50.350360 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qh75b" podUID="1609e3b0-3601-4e3b-bb3c-b89f06a20319" Apr 16 14:52:50.350542 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:52:50.350429 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-m9n7j" podUID="aee05764-3834-4ece-aff3-6c244c99378a" Apr 16 14:52:50.350542 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:52:50.350477 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vpmk7" podUID="ba7ca911-2481-4fe7-9079-e770b1840406" Apr 16 14:52:51.529780 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:51.529574 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jc5c4" event={"ID":"7c9231d4-b789-4d31-9ab1-0514edc9a681","Type":"ContainerStarted","Data":"9f3326be21ee86850f76fff16da109ef7159c62ff0facfd7887c501d4b6745e6"} Apr 16 14:52:51.532055 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:51.532027 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wrxcw" event={"ID":"ac641325-3949-4503-86d7-77e5ededa110","Type":"ContainerStarted","Data":"e7c6558aee0b673667a6b69fa5ce70ee2f74f5ab43a4128bccd5bd32c2a7901e"} Apr 16 14:52:51.532055 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:51.532059 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wrxcw" event={"ID":"ac641325-3949-4503-86d7-77e5ededa110","Type":"ContainerStarted","Data":"dc92605f94a3bd481fbdd647535825a2ead0e3a14ca497892612c0f731d27f4b"} Apr 16 14:52:51.532248 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:51.532070 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wrxcw" event={"ID":"ac641325-3949-4503-86d7-77e5ededa110","Type":"ContainerStarted","Data":"30a956736f63390e9ebd306ada4261e195cd7c2cda2f070302a7ab900616b8f3"} Apr 16 14:52:51.532248 ip-10-0-139-47 
kubenswrapper[2579]: I0416 14:52:51.532083 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wrxcw" event={"ID":"ac641325-3949-4503-86d7-77e5ededa110","Type":"ContainerStarted","Data":"4ab6ad37f55aefc7d0b3dae11b82324bc72991f9f093f3959f29741066c5a407"} Apr 16 14:52:51.532248 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:51.532095 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wrxcw" event={"ID":"ac641325-3949-4503-86d7-77e5ededa110","Type":"ContainerStarted","Data":"4510bdc3b171f0ff54dfdaa641083e61b1911fa4c9409594fb005ca02a06f668"} Apr 16 14:52:51.532248 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:51.532110 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wrxcw" event={"ID":"ac641325-3949-4503-86d7-77e5ededa110","Type":"ContainerStarted","Data":"49028333a11d42c12d003aef98384c55a3e5c0ff2b60af3a6da33a8c79720dfb"} Apr 16 14:52:51.533180 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:51.533156 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-dl2m4" event={"ID":"da8fe018-bd6c-490e-9197-d5ea9c881a92","Type":"ContainerStarted","Data":"ad7073f5e047e5acec7f6382058a8123b07d04939850d23c06569c3e1b22e719"} Apr 16 14:52:51.534329 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:51.534301 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-mvlp4" event={"ID":"c0f8acc5-b88a-4adf-b2de-441b178001bf","Type":"ContainerStarted","Data":"1ed4a33c7eeb34c9ece41652b090d8d67fd7fc7caf3340feac145cdb9c1e58a9"} Apr 16 14:52:51.535447 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:51.535419 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-47.ec2.internal" event={"ID":"da3e9a6a335603c20524d6311693f61c","Type":"ContainerStarted","Data":"6de9df7dbe89069d98261a7977a2607c036dfba5a4d7de21eba32d0253667c8f"} Apr 16 
14:52:51.536694 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:51.536672 2579 generic.go:358] "Generic (PLEG): container finished" podID="0c5598820f1d3314f2d1a9302b66f570" containerID="1cbc992687f98e37f6c7646f54ff7a07948473054b494daea980b0f605cc6576" exitCode=0 Apr 16 14:52:51.536797 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:51.536728 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-47.ec2.internal" event={"ID":"0c5598820f1d3314f2d1a9302b66f570","Type":"ContainerDied","Data":"1cbc992687f98e37f6c7646f54ff7a07948473054b494daea980b0f605cc6576"} Apr 16 14:52:51.539927 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:51.539856 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-8jpq8" event={"ID":"aabf9cf2-dcc2-4a42-b211-e34be354e3ed","Type":"ContainerStarted","Data":"21de504a940172cba2347fad2ec1aafd380c0ad5ee4f9e060af573c0a0dcccec"} Apr 16 14:52:51.541450 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:51.541417 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-96845" event={"ID":"f9f0740c-e453-47f4-a40d-564dd1731056","Type":"ContainerStarted","Data":"faebca6c6286d7affebafb26f1564cb64b1e259449dcb09e228acfc505259e8d"} Apr 16 14:52:51.542655 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:51.542636 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-flwwz" event={"ID":"16b9981c-80e1-46aa-90b5-651d968a8850","Type":"ContainerStarted","Data":"95652598417dd0fafc46ca7be5ab43ad492fec0b756e5a495282f45b5cf4679a"} Apr 16 14:52:51.543889 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:51.543867 2579 generic.go:358] "Generic (PLEG): container finished" podID="b9885cab-5589-4597-9b6e-6ef20f7bc2b2" containerID="cfb489503e9699118c6c5032921a522bf4546ae2a1d85e5b31211c48579aaeb8" exitCode=0 Apr 16 14:52:51.544001 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:51.543896 2579 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dtvll" event={"ID":"b9885cab-5589-4597-9b6e-6ef20f7bc2b2","Type":"ContainerDied","Data":"cfb489503e9699118c6c5032921a522bf4546ae2a1d85e5b31211c48579aaeb8"} Apr 16 14:52:51.551923 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:51.551856 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-dl2m4" podStartSLOduration=2.8363792979999998 podStartE2EDuration="20.551845989s" podCreationTimestamp="2026-04-16 14:52:31 +0000 UTC" firstStartedPulling="2026-04-16 14:52:32.641483201 +0000 UTC m=+1.824020954" lastFinishedPulling="2026-04-16 14:52:50.356949886 +0000 UTC m=+19.539487645" observedRunningTime="2026-04-16 14:52:51.55150144 +0000 UTC m=+20.734039241" watchObservedRunningTime="2026-04-16 14:52:51.551845989 +0000 UTC m=+20.734383761" Apr 16 14:52:51.566566 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:51.566300 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-mvlp4" podStartSLOduration=2.834312151 podStartE2EDuration="20.566279546s" podCreationTimestamp="2026-04-16 14:52:31 +0000 UTC" firstStartedPulling="2026-04-16 14:52:32.595081996 +0000 UTC m=+1.777619748" lastFinishedPulling="2026-04-16 14:52:50.327049375 +0000 UTC m=+19.509587143" observedRunningTime="2026-04-16 14:52:51.565598232 +0000 UTC m=+20.748136007" watchObservedRunningTime="2026-04-16 14:52:51.566279546 +0000 UTC m=+20.748817322" Apr 16 14:52:51.583365 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:51.583296 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-flwwz" podStartSLOduration=2.763079149 podStartE2EDuration="20.583275391s" podCreationTimestamp="2026-04-16 14:52:31 +0000 UTC" firstStartedPulling="2026-04-16 14:52:32.623233638 +0000 UTC m=+1.805771392" lastFinishedPulling="2026-04-16 14:52:50.443429867 +0000 UTC m=+19.625967634" 
observedRunningTime="2026-04-16 14:52:51.581757935 +0000 UTC m=+20.764295725" watchObservedRunningTime="2026-04-16 14:52:51.583275391 +0000 UTC m=+20.765813169" Apr 16 14:52:51.598291 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:51.598234 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-47.ec2.internal" podStartSLOduration=19.598214682 podStartE2EDuration="19.598214682s" podCreationTimestamp="2026-04-16 14:52:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:52:51.597593143 +0000 UTC m=+20.780130918" watchObservedRunningTime="2026-04-16 14:52:51.598214682 +0000 UTC m=+20.780752457" Apr 16 14:52:51.645576 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:51.645530 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-8jpq8" podStartSLOduration=2.983044186 podStartE2EDuration="20.645514923s" podCreationTimestamp="2026-04-16 14:52:31 +0000 UTC" firstStartedPulling="2026-04-16 14:52:32.664582536 +0000 UTC m=+1.847120288" lastFinishedPulling="2026-04-16 14:52:50.327053257 +0000 UTC m=+19.509591025" observedRunningTime="2026-04-16 14:52:51.645237757 +0000 UTC m=+20.827775531" watchObservedRunningTime="2026-04-16 14:52:51.645514923 +0000 UTC m=+20.828052732" Apr 16 14:52:51.645741 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:51.645666 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-96845" podStartSLOduration=2.850459444 podStartE2EDuration="20.64566074s" podCreationTimestamp="2026-04-16 14:52:31 +0000 UTC" firstStartedPulling="2026-04-16 14:52:32.649805722 +0000 UTC m=+1.832343476" lastFinishedPulling="2026-04-16 14:52:50.445007009 +0000 UTC m=+19.627544772" observedRunningTime="2026-04-16 14:52:51.63018896 +0000 UTC m=+20.812726735" 
watchObservedRunningTime="2026-04-16 14:52:51.64566074 +0000 UTC m=+20.828198514" Apr 16 14:52:52.130721 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:52.130693 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-8jpq8" Apr 16 14:52:52.131444 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:52.131424 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-8jpq8" Apr 16 14:52:52.137209 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:52.137185 2579 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 16 14:52:52.331818 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:52.331630 2579 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T14:52:52.137205516Z","UUID":"2488341b-1430-4f85-ab3e-9f30a41fa17d","Handler":null,"Name":"","Endpoint":""} Apr 16 14:52:52.335936 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:52.334054 2579 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 16 14:52:52.335936 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:52.334092 2579 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 16 14:52:52.350136 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:52.350110 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qh75b" Apr 16 14:52:52.350280 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:52.350110 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-m9n7j" Apr 16 14:52:52.350354 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:52:52.350331 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-m9n7j" podUID="aee05764-3834-4ece-aff3-6c244c99378a" Apr 16 14:52:52.350354 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:52.350112 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vpmk7" Apr 16 14:52:52.350457 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:52:52.350208 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qh75b" podUID="1609e3b0-3601-4e3b-bb3c-b89f06a20319" Apr 16 14:52:52.350510 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:52:52.350478 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vpmk7" podUID="ba7ca911-2481-4fe7-9079-e770b1840406" Apr 16 14:52:52.499525 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:52.499469 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-8jpq8" Apr 16 14:52:52.500340 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:52.500255 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-8jpq8" Apr 16 14:52:52.548859 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:52.548822 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jc5c4" event={"ID":"7c9231d4-b789-4d31-9ab1-0514edc9a681","Type":"ContainerStarted","Data":"46a1e893d5252ae2ca7f0e4da3252224cad1991b11a7fcf406773e6a4a6e2b7a"} Apr 16 14:52:52.550729 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:52.550671 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-nsdp5" event={"ID":"15789381-f4bd-4597-8239-7c3b98a87e12","Type":"ContainerStarted","Data":"53eff993d6b63096c9f231073663d67930953c075aa746feca87dc5bbdcccf90"} Apr 16 14:52:52.553132 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:52.553102 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-47.ec2.internal" event={"ID":"0c5598820f1d3314f2d1a9302b66f570","Type":"ContainerStarted","Data":"977a2c057022b6439b8e7987aacb24bc7d9a19156e43fbb377168164cb7f6c2b"} Apr 16 14:52:52.568698 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:52.568657 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-nsdp5" podStartSLOduration=3.811721056 podStartE2EDuration="21.568639198s" podCreationTimestamp="2026-04-16 14:52:31 +0000 UTC" firstStartedPulling="2026-04-16 14:52:32.581757166 +0000 UTC m=+1.764294921" 
lastFinishedPulling="2026-04-16 14:52:50.338675305 +0000 UTC m=+19.521213063" observedRunningTime="2026-04-16 14:52:52.568077213 +0000 UTC m=+21.750614993" watchObservedRunningTime="2026-04-16 14:52:52.568639198 +0000 UTC m=+21.751176974" Apr 16 14:52:53.557218 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:53.557113 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jc5c4" event={"ID":"7c9231d4-b789-4d31-9ab1-0514edc9a681","Type":"ContainerStarted","Data":"681f1cc3413740abc0829b2b4aa02cb1b6186eef7813c6e99cf9387b8096b590"} Apr 16 14:52:53.560341 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:53.560298 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wrxcw" event={"ID":"ac641325-3949-4503-86d7-77e5ededa110","Type":"ContainerStarted","Data":"1a6740eb1bc8ede052779d756a705bbf0078524f0ea2bf6ff16f77536053d496"} Apr 16 14:52:53.575283 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:53.575232 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-47.ec2.internal" podStartSLOduration=21.575214593 podStartE2EDuration="21.575214593s" podCreationTimestamp="2026-04-16 14:52:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:52:52.586063036 +0000 UTC m=+21.768600810" watchObservedRunningTime="2026-04-16 14:52:53.575214593 +0000 UTC m=+22.757752369" Apr 16 14:52:54.350600 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:54.350564 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-m9n7j" Apr 16 14:52:54.350759 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:54.350691 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vpmk7" Apr 16 14:52:54.350759 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:52:54.350699 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-m9n7j" podUID="aee05764-3834-4ece-aff3-6c244c99378a" Apr 16 14:52:54.350889 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:52:54.350817 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vpmk7" podUID="ba7ca911-2481-4fe7-9079-e770b1840406" Apr 16 14:52:54.350889 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:54.350867 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qh75b" Apr 16 14:52:54.351016 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:52:54.350962 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qh75b" podUID="1609e3b0-3601-4e3b-bb3c-b89f06a20319" Apr 16 14:52:56.350248 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:56.350074 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-m9n7j" Apr 16 14:52:56.350869 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:56.350089 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qh75b" Apr 16 14:52:56.350869 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:52:56.350319 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-m9n7j" podUID="aee05764-3834-4ece-aff3-6c244c99378a" Apr 16 14:52:56.350869 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:56.350108 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vpmk7" Apr 16 14:52:56.350869 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:52:56.350387 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qh75b" podUID="1609e3b0-3601-4e3b-bb3c-b89f06a20319" Apr 16 14:52:56.350869 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:52:56.350453 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vpmk7" podUID="ba7ca911-2481-4fe7-9079-e770b1840406" Apr 16 14:52:56.567328 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:56.567291 2579 generic.go:358] "Generic (PLEG): container finished" podID="b9885cab-5589-4597-9b6e-6ef20f7bc2b2" containerID="d1ff9783b6aa865735c271c013b5bd4b0f83bf51bb82db24478a3dc860409de6" exitCode=0 Apr 16 14:52:56.567486 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:56.567375 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dtvll" event={"ID":"b9885cab-5589-4597-9b6e-6ef20f7bc2b2","Type":"ContainerDied","Data":"d1ff9783b6aa865735c271c013b5bd4b0f83bf51bb82db24478a3dc860409de6"} Apr 16 14:52:56.570697 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:56.570676 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wrxcw" event={"ID":"ac641325-3949-4503-86d7-77e5ededa110","Type":"ContainerStarted","Data":"e9586b43b4d2f2cf99555e20bcfdf5ef31ba84dde269846482a1342b31904b50"} Apr 16 14:52:56.570987 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:56.570971 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-wrxcw" Apr 16 14:52:56.571058 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:56.570996 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-wrxcw" Apr 16 14:52:56.571058 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:56.571009 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-wrxcw" Apr 16 14:52:56.585785 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:56.585762 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wrxcw" Apr 16 14:52:56.586034 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:56.586016 2579 kubelet.go:2658] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wrxcw"
Apr 16 14:52:56.588620 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:56.588582 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jc5c4" podStartSLOduration=5.363263587 podStartE2EDuration="25.588570578s" podCreationTimestamp="2026-04-16 14:52:31 +0000 UTC" firstStartedPulling="2026-04-16 14:52:32.668744487 +0000 UTC m=+1.851282239" lastFinishedPulling="2026-04-16 14:52:52.894051463 +0000 UTC m=+22.076589230" observedRunningTime="2026-04-16 14:52:53.57508712 +0000 UTC m=+22.757624895" watchObservedRunningTime="2026-04-16 14:52:56.588570578 +0000 UTC m=+25.771108352"
Apr 16 14:52:56.611656 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:56.611562 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-wrxcw" podStartSLOduration=7.629213722 podStartE2EDuration="25.611546437s" podCreationTimestamp="2026-04-16 14:52:31 +0000 UTC" firstStartedPulling="2026-04-16 14:52:32.656205496 +0000 UTC m=+1.838743260" lastFinishedPulling="2026-04-16 14:52:50.638538217 +0000 UTC m=+19.821075975" observedRunningTime="2026-04-16 14:52:56.611415348 +0000 UTC m=+25.793953158" watchObservedRunningTime="2026-04-16 14:52:56.611546437 +0000 UTC m=+25.794084211"
Apr 16 14:52:57.504204 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:57.504107 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-qh75b"]
Apr 16 14:52:57.504752 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:57.504251 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qh75b"
Apr 16 14:52:57.504752 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:52:57.504368 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qh75b" podUID="1609e3b0-3601-4e3b-bb3c-b89f06a20319"
Apr 16 14:52:57.507004 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:57.506978 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-vpmk7"]
Apr 16 14:52:57.507104 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:57.507091 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vpmk7"
Apr 16 14:52:57.507194 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:52:57.507175 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vpmk7" podUID="ba7ca911-2481-4fe7-9079-e770b1840406"
Apr 16 14:52:57.509816 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:57.509792 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-m9n7j"]
Apr 16 14:52:57.509913 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:57.509891 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-m9n7j"
Apr 16 14:52:57.509992 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:52:57.509976 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-m9n7j" podUID="aee05764-3834-4ece-aff3-6c244c99378a"
Apr 16 14:52:57.574309 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:57.574278 2579 generic.go:358] "Generic (PLEG): container finished" podID="b9885cab-5589-4597-9b6e-6ef20f7bc2b2" containerID="afe092d1265053b6a52def3121a5ad3ce2191d98c6dad8a5d3872c193ba1230a" exitCode=0
Apr 16 14:52:57.574452 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:57.574346 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dtvll" event={"ID":"b9885cab-5589-4597-9b6e-6ef20f7bc2b2","Type":"ContainerDied","Data":"afe092d1265053b6a52def3121a5ad3ce2191d98c6dad8a5d3872c193ba1230a"}
Apr 16 14:52:57.976440 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:57.976311 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/aee05764-3834-4ece-aff3-6c244c99378a-original-pull-secret\") pod \"global-pull-secret-syncer-m9n7j\" (UID: \"aee05764-3834-4ece-aff3-6c244c99378a\") " pod="kube-system/global-pull-secret-syncer-m9n7j"
Apr 16 14:52:57.976440 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:52:57.976432 2579 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 14:52:57.976602 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:52:57.976484 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aee05764-3834-4ece-aff3-6c244c99378a-original-pull-secret podName:aee05764-3834-4ece-aff3-6c244c99378a nodeName:}" failed. No retries permitted until 2026-04-16 14:53:13.976470998 +0000 UTC m=+43.159008750 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/aee05764-3834-4ece-aff3-6c244c99378a-original-pull-secret") pod "global-pull-secret-syncer-m9n7j" (UID: "aee05764-3834-4ece-aff3-6c244c99378a") : object "kube-system"/"original-pull-secret" not registered
Apr 16 14:52:58.580740 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:58.580655 2579 generic.go:358] "Generic (PLEG): container finished" podID="b9885cab-5589-4597-9b6e-6ef20f7bc2b2" containerID="c18cd148e72be90c4c02a82c77c57f74cf7d8f620994e7e98cc1fc705dba8d93" exitCode=0
Apr 16 14:52:58.581105 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:58.580731 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dtvll" event={"ID":"b9885cab-5589-4597-9b6e-6ef20f7bc2b2","Type":"ContainerDied","Data":"c18cd148e72be90c4c02a82c77c57f74cf7d8f620994e7e98cc1fc705dba8d93"}
Apr 16 14:52:59.350686 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:59.350651 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vpmk7"
Apr 16 14:52:59.350686 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:59.350686 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-m9n7j"
Apr 16 14:52:59.350925 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:52:59.350662 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qh75b"
Apr 16 14:52:59.350925 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:52:59.350800 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vpmk7" podUID="ba7ca911-2481-4fe7-9079-e770b1840406"
Apr 16 14:52:59.350925 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:52:59.350852 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qh75b" podUID="1609e3b0-3601-4e3b-bb3c-b89f06a20319"
Apr 16 14:52:59.351087 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:52:59.350951 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-m9n7j" podUID="aee05764-3834-4ece-aff3-6c244c99378a"
Apr 16 14:53:01.351546 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:01.351515 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vpmk7"
Apr 16 14:53:01.352011 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:53:01.351634 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vpmk7" podUID="ba7ca911-2481-4fe7-9079-e770b1840406"
Apr 16 14:53:01.352011 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:01.351711 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-m9n7j"
Apr 16 14:53:01.352011 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:53:01.351837 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-m9n7j" podUID="aee05764-3834-4ece-aff3-6c244c99378a"
Apr 16 14:53:01.352011 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:01.351874 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qh75b"
Apr 16 14:53:01.352011 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:53:01.351960 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qh75b" podUID="1609e3b0-3601-4e3b-bb3c-b89f06a20319"
Apr 16 14:53:03.349860 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:03.349822 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qh75b"
Apr 16 14:53:03.350344 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:03.349829 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vpmk7"
Apr 16 14:53:03.350344 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:53:03.349973 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qh75b" podUID="1609e3b0-3601-4e3b-bb3c-b89f06a20319"
Apr 16 14:53:03.350344 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:03.349829 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-m9n7j"
Apr 16 14:53:03.350344 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:53:03.350016 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vpmk7" podUID="ba7ca911-2481-4fe7-9079-e770b1840406"
Apr 16 14:53:03.350344 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:53:03.350114 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-m9n7j" podUID="aee05764-3834-4ece-aff3-6c244c99378a"
Apr 16 14:53:03.620922 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:03.620875 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-47.ec2.internal" event="NodeReady"
Apr 16 14:53:03.621091 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:03.621072 2579 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 16 14:53:03.657304 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:03.657235 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-5d9f76899d-gd279"]
Apr 16 14:53:03.694458 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:03.694414 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-5cb6cf4cb4-7st9s"]
Apr 16 14:53:03.694630 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:03.694563 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5d9f76899d-gd279"
Apr 16 14:53:03.697289 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:03.697255 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 16 14:53:03.697471 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:03.697255 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 16 14:53:03.697636 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:03.697611 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-q9vr8\""
Apr 16 14:53:03.697811 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:03.697797 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 16 14:53:03.703742 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:03.703715 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 16 14:53:03.712867 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:03.712840 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-5cb6cf4cb4-7st9s"]
Apr 16 14:53:03.713029 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:03.712878 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5d9f76899d-gd279"]
Apr 16 14:53:03.713029 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:03.712896 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-77fc8586d5-vtrgt"]
Apr 16 14:53:03.713135 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:03.713046 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-7st9s"
Apr 16 14:53:03.715735 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:03.715710 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\""
Apr 16 14:53:03.715977 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:03.715959 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-xpsp2\""
Apr 16 14:53:03.716069 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:03.715958 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\""
Apr 16 14:53:03.731213 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:03.731186 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-kd7rp"]
Apr 16 14:53:03.731334 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:03.731298 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-77fc8586d5-vtrgt"
Apr 16 14:53:03.734394 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:03.734345 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\""
Apr 16 14:53:03.734571 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:03.734557 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\""
Apr 16 14:53:03.734653 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:03.734589 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-fg9dh\""
Apr 16 14:53:03.734828 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:03.734766 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\""
Apr 16 14:53:03.734828 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:03.734826 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\""
Apr 16 14:53:03.750751 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:03.750714 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b7b574cf-sbk4t"]
Apr 16 14:53:03.750974 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:03.750956 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-kd7rp"
Apr 16 14:53:03.753525 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:03.753497 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 16 14:53:03.753525 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:03.753511 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 16 14:53:03.753703 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:03.753510 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 16 14:53:03.753703 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:03.753569 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-9k9b9\""
Apr 16 14:53:03.772892 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:03.772856 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-6fd7f98bbb-69vqc"]
Apr 16 14:53:03.773079 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:03.773027 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b7b574cf-sbk4t"
Apr 16 14:53:03.775582 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:03.775559 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\""
Apr 16 14:53:03.776051 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:03.776029 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\""
Apr 16 14:53:03.776051 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:03.776049 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\""
Apr 16 14:53:03.776204 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:03.776092 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\""
Apr 16 14:53:03.797604 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:03.797574 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-kd7rp"]
Apr 16 14:53:03.797604 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:03.797605 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-77fc8586d5-vtrgt"]
Apr 16 14:53:03.797846 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:03.797618 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-6fd7f98bbb-69vqc"]
Apr 16 14:53:03.797846 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:03.797629 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b7b574cf-sbk4t"]
Apr 16 14:53:03.797846 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:03.797742 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-hfndr"]
Apr 16 14:53:03.798152 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:03.798129 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6fd7f98bbb-69vqc"
Apr 16 14:53:03.800706 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:03.800682 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\""
Apr 16 14:53:03.813344 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:03.813316 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-hfndr"]
Apr 16 14:53:03.813506 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:03.813487 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-hfndr"
Apr 16 14:53:03.816212 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:03.816090 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-nrznh\""
Apr 16 14:53:03.816212 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:03.816109 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 16 14:53:03.816212 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:03.816128 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 16 14:53:03.817682 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:03.817660 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/cabd1204-eddf-42ed-9bad-9f118298b94b-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-77fc8586d5-vtrgt\" (UID: \"cabd1204-eddf-42ed-9bad-9f118298b94b\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-77fc8586d5-vtrgt"
Apr 16 14:53:03.817791 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:03.817699 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgkfx\" (UniqueName: \"kubernetes.io/projected/cabd1204-eddf-42ed-9bad-9f118298b94b-kube-api-access-xgkfx\") pod \"managed-serviceaccount-addon-agent-77fc8586d5-vtrgt\" (UID: \"cabd1204-eddf-42ed-9bad-9f118298b94b\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-77fc8586d5-vtrgt"
Apr 16 14:53:03.817791 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:03.817760 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/75f5567a-e94c-421f-8e1b-be0c5c44e10e-image-registry-private-configuration\") pod \"image-registry-5d9f76899d-gd279\" (UID: \"75f5567a-e94c-421f-8e1b-be0c5c44e10e\") " pod="openshift-image-registry/image-registry-5d9f76899d-gd279"
Apr 16 14:53:03.817914 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:03.817798 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/75f5567a-e94c-421f-8e1b-be0c5c44e10e-trusted-ca\") pod \"image-registry-5d9f76899d-gd279\" (UID: \"75f5567a-e94c-421f-8e1b-be0c5c44e10e\") " pod="openshift-image-registry/image-registry-5d9f76899d-gd279"
Apr 16 14:53:03.817963 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:03.817947 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/75f5567a-e94c-421f-8e1b-be0c5c44e10e-registry-certificates\") pod \"image-registry-5d9f76899d-gd279\" (UID: \"75f5567a-e94c-421f-8e1b-be0c5c44e10e\") " pod="openshift-image-registry/image-registry-5d9f76899d-gd279"
Apr 16 14:53:03.818004 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:03.817983 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/75f5567a-e94c-421f-8e1b-be0c5c44e10e-registry-tls\") pod \"image-registry-5d9f76899d-gd279\" (UID: \"75f5567a-e94c-421f-8e1b-be0c5c44e10e\") " pod="openshift-image-registry/image-registry-5d9f76899d-gd279"
Apr 16 14:53:03.818043 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:03.818016 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/26a2efd2-c6ce-46da-b1fe-6a739a8edd83-nginx-conf\") pod \"networking-console-plugin-5cb6cf4cb4-7st9s\" (UID: \"26a2efd2-c6ce-46da-b1fe-6a739a8edd83\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-7st9s"
Apr 16 14:53:03.818080 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:03.818056 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/75f5567a-e94c-421f-8e1b-be0c5c44e10e-installation-pull-secrets\") pod \"image-registry-5d9f76899d-gd279\" (UID: \"75f5567a-e94c-421f-8e1b-be0c5c44e10e\") " pod="openshift-image-registry/image-registry-5d9f76899d-gd279"
Apr 16 14:53:03.818163 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:03.818134 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/26a2efd2-c6ce-46da-b1fe-6a739a8edd83-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-7st9s\" (UID: \"26a2efd2-c6ce-46da-b1fe-6a739a8edd83\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-7st9s"
Apr 16 14:53:03.818258 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:03.818181 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/75f5567a-e94c-421f-8e1b-be0c5c44e10e-ca-trust-extracted\") pod \"image-registry-5d9f76899d-gd279\" (UID: \"75f5567a-e94c-421f-8e1b-be0c5c44e10e\") " pod="openshift-image-registry/image-registry-5d9f76899d-gd279"
Apr 16 14:53:03.818258 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:03.818214 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/75f5567a-e94c-421f-8e1b-be0c5c44e10e-bound-sa-token\") pod \"image-registry-5d9f76899d-gd279\" (UID: \"75f5567a-e94c-421f-8e1b-be0c5c44e10e\") " pod="openshift-image-registry/image-registry-5d9f76899d-gd279"
Apr 16 14:53:03.818258 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:03.818240 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4fv7\" (UniqueName: \"kubernetes.io/projected/75f5567a-e94c-421f-8e1b-be0c5c44e10e-kube-api-access-p4fv7\") pod \"image-registry-5d9f76899d-gd279\" (UID: \"75f5567a-e94c-421f-8e1b-be0c5c44e10e\") " pod="openshift-image-registry/image-registry-5d9f76899d-gd279"
Apr 16 14:53:03.919183 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:03.919088 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vsbhc\" (UniqueName: \"kubernetes.io/projected/59a9063e-512b-4e8e-9292-b64616eb9a33-kube-api-access-vsbhc\") pod \"cluster-proxy-proxy-agent-6b7b574cf-sbk4t\" (UID: \"59a9063e-512b-4e8e-9292-b64616eb9a33\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b7b574cf-sbk4t"
Apr 16 14:53:03.919183 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:03.919179 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/75f5567a-e94c-421f-8e1b-be0c5c44e10e-registry-certificates\") pod \"image-registry-5d9f76899d-gd279\" (UID: \"75f5567a-e94c-421f-8e1b-be0c5c44e10e\") " pod="openshift-image-registry/image-registry-5d9f76899d-gd279"
Apr 16 14:53:03.919425 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:03.919203 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/26a2efd2-c6ce-46da-b1fe-6a739a8edd83-nginx-conf\") pod \"networking-console-plugin-5cb6cf4cb4-7st9s\" (UID: \"26a2efd2-c6ce-46da-b1fe-6a739a8edd83\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-7st9s"
Apr 16 14:53:03.919425 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:03.919243 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/26a2efd2-c6ce-46da-b1fe-6a739a8edd83-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-7st9s\" (UID: \"26a2efd2-c6ce-46da-b1fe-6a739a8edd83\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-7st9s"
Apr 16 14:53:03.919425 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:03.919267 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/59a9063e-512b-4e8e-9292-b64616eb9a33-hub\") pod \"cluster-proxy-proxy-agent-6b7b574cf-sbk4t\" (UID: \"59a9063e-512b-4e8e-9292-b64616eb9a33\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b7b574cf-sbk4t"
Apr 16 14:53:03.919425 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:53:03.919350 2579 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 16 14:53:03.919425 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:03.919408 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nsnnw\" (UniqueName: \"kubernetes.io/projected/545cbae5-e741-4260-bd9d-d9cf35cd5a5a-kube-api-access-nsnnw\") pod \"dns-default-hfndr\" (UID: \"545cbae5-e741-4260-bd9d-d9cf35cd5a5a\") " pod="openshift-dns/dns-default-hfndr"
Apr 16 14:53:03.919425 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:53:03.919421 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26a2efd2-c6ce-46da-b1fe-6a739a8edd83-networking-console-plugin-cert podName:26a2efd2-c6ce-46da-b1fe-6a739a8edd83 nodeName:}" failed. No retries permitted until 2026-04-16 14:53:04.41940299 +0000 UTC m=+33.601940749 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/26a2efd2-c6ce-46da-b1fe-6a739a8edd83-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-7st9s" (UID: "26a2efd2-c6ce-46da-b1fe-6a739a8edd83") : secret "networking-console-plugin-cert" not found
Apr 16 14:53:03.919738 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:03.919456 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/75f5567a-e94c-421f-8e1b-be0c5c44e10e-ca-trust-extracted\") pod \"image-registry-5d9f76899d-gd279\" (UID: \"75f5567a-e94c-421f-8e1b-be0c5c44e10e\") " pod="openshift-image-registry/image-registry-5d9f76899d-gd279"
Apr 16 14:53:03.919738 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:03.919479 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/545cbae5-e741-4260-bd9d-d9cf35cd5a5a-metrics-tls\") pod \"dns-default-hfndr\" (UID: \"545cbae5-e741-4260-bd9d-d9cf35cd5a5a\") " pod="openshift-dns/dns-default-hfndr"
Apr 16 14:53:03.919738 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:03.919504 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/91fb67d9-4995-4dfe-bc80-067575a9d732-cert\") pod \"ingress-canary-kd7rp\" (UID: \"91fb67d9-4995-4dfe-bc80-067575a9d732\") " pod="openshift-ingress-canary/ingress-canary-kd7rp"
Apr 16 14:53:03.919738 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:03.919526 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/545cbae5-e741-4260-bd9d-d9cf35cd5a5a-config-volume\") pod \"dns-default-hfndr\" (UID: \"545cbae5-e741-4260-bd9d-d9cf35cd5a5a\") " pod="openshift-dns/dns-default-hfndr"
Apr 16 14:53:03.919738 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:03.919553 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/cabd1204-eddf-42ed-9bad-9f118298b94b-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-77fc8586d5-vtrgt\" (UID: \"cabd1204-eddf-42ed-9bad-9f118298b94b\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-77fc8586d5-vtrgt"
Apr 16 14:53:03.919738 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:03.919592 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlznx\" (UniqueName: \"kubernetes.io/projected/29711f9c-de92-459f-9949-54d43d314175-kube-api-access-xlznx\") pod \"klusterlet-addon-workmgr-6fd7f98bbb-69vqc\" (UID: \"29711f9c-de92-459f-9949-54d43d314175\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6fd7f98bbb-69vqc"
Apr 16 14:53:03.919738 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:03.919621 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xgkfx\" (UniqueName: \"kubernetes.io/projected/cabd1204-eddf-42ed-9bad-9f118298b94b-kube-api-access-xgkfx\") pod \"managed-serviceaccount-addon-agent-77fc8586d5-vtrgt\" (UID: \"cabd1204-eddf-42ed-9bad-9f118298b94b\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-77fc8586d5-vtrgt"
Apr 16 14:53:03.919738 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:03.919644 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/75f5567a-e94c-421f-8e1b-be0c5c44e10e-image-registry-private-configuration\") pod \"image-registry-5d9f76899d-gd279\" (UID: \"75f5567a-e94c-421f-8e1b-be0c5c44e10e\") " pod="openshift-image-registry/image-registry-5d9f76899d-gd279"
Apr 16 14:53:03.919738 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:03.919666 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/75f5567a-e94c-421f-8e1b-be0c5c44e10e-trusted-ca\") pod \"image-registry-5d9f76899d-gd279\" (UID: \"75f5567a-e94c-421f-8e1b-be0c5c44e10e\") " pod="openshift-image-registry/image-registry-5d9f76899d-gd279"
Apr 16 14:53:03.919738 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:03.919699 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/75f5567a-e94c-421f-8e1b-be0c5c44e10e-registry-tls\") pod \"image-registry-5d9f76899d-gd279\" (UID: \"75f5567a-e94c-421f-8e1b-be0c5c44e10e\") " pod="openshift-image-registry/image-registry-5d9f76899d-gd279"
Apr 16 14:53:03.920217 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:53:03.919777 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 14:53:03.920217 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:53:03.919790 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5d9f76899d-gd279: secret "image-registry-tls" not found
Apr 16 14:53:03.920217 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:53:03.919829 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/75f5567a-e94c-421f-8e1b-be0c5c44e10e-registry-tls podName:75f5567a-e94c-421f-8e1b-be0c5c44e10e nodeName:}" failed. No retries permitted until 2026-04-16 14:53:04.419813824 +0000 UTC m=+33.602351591 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/75f5567a-e94c-421f-8e1b-be0c5c44e10e-registry-tls") pod "image-registry-5d9f76899d-gd279" (UID: "75f5567a-e94c-421f-8e1b-be0c5c44e10e") : secret "image-registry-tls" not found
Apr 16 14:53:03.920217 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:03.920001 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/26a2efd2-c6ce-46da-b1fe-6a739a8edd83-nginx-conf\") pod \"networking-console-plugin-5cb6cf4cb4-7st9s\" (UID: \"26a2efd2-c6ce-46da-b1fe-6a739a8edd83\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-7st9s"
Apr 16 14:53:03.920217 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:03.920112 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/75f5567a-e94c-421f-8e1b-be0c5c44e10e-ca-trust-extracted\") pod \"image-registry-5d9f76899d-gd279\" (UID: \"75f5567a-e94c-421f-8e1b-be0c5c44e10e\") " pod="openshift-image-registry/image-registry-5d9f76899d-gd279"
Apr 16 14:53:03.920217 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:03.920161 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/75f5567a-e94c-421f-8e1b-be0c5c44e10e-installation-pull-secrets\") pod \"image-registry-5d9f76899d-gd279\" (UID: \"75f5567a-e94c-421f-8e1b-be0c5c44e10e\") " pod="openshift-image-registry/image-registry-5d9f76899d-gd279"
Apr 16 14:53:03.920217
ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:03.920203 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/29711f9c-de92-459f-9949-54d43d314175-tmp\") pod \"klusterlet-addon-workmgr-6fd7f98bbb-69vqc\" (UID: \"29711f9c-de92-459f-9949-54d43d314175\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6fd7f98bbb-69vqc" Apr 16 14:53:03.920528 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:03.920242 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/75f5567a-e94c-421f-8e1b-be0c5c44e10e-bound-sa-token\") pod \"image-registry-5d9f76899d-gd279\" (UID: \"75f5567a-e94c-421f-8e1b-be0c5c44e10e\") " pod="openshift-image-registry/image-registry-5d9f76899d-gd279" Apr 16 14:53:03.920528 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:03.920269 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/545cbae5-e741-4260-bd9d-d9cf35cd5a5a-tmp-dir\") pod \"dns-default-hfndr\" (UID: \"545cbae5-e741-4260-bd9d-d9cf35cd5a5a\") " pod="openshift-dns/dns-default-hfndr" Apr 16 14:53:03.920528 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:03.920293 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p4fv7\" (UniqueName: \"kubernetes.io/projected/75f5567a-e94c-421f-8e1b-be0c5c44e10e-kube-api-access-p4fv7\") pod \"image-registry-5d9f76899d-gd279\" (UID: \"75f5567a-e94c-421f-8e1b-be0c5c44e10e\") " pod="openshift-image-registry/image-registry-5d9f76899d-gd279" Apr 16 14:53:03.920528 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:03.920338 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/59a9063e-512b-4e8e-9292-b64616eb9a33-ca\") pod 
\"cluster-proxy-proxy-agent-6b7b574cf-sbk4t\" (UID: \"59a9063e-512b-4e8e-9292-b64616eb9a33\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b7b574cf-sbk4t" Apr 16 14:53:03.920528 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:03.920422 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/59a9063e-512b-4e8e-9292-b64616eb9a33-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-6b7b574cf-sbk4t\" (UID: \"59a9063e-512b-4e8e-9292-b64616eb9a33\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b7b574cf-sbk4t" Apr 16 14:53:03.920528 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:03.920488 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/29711f9c-de92-459f-9949-54d43d314175-klusterlet-config\") pod \"klusterlet-addon-workmgr-6fd7f98bbb-69vqc\" (UID: \"29711f9c-de92-459f-9949-54d43d314175\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6fd7f98bbb-69vqc" Apr 16 14:53:03.920528 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:03.920515 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fjkc\" (UniqueName: \"kubernetes.io/projected/91fb67d9-4995-4dfe-bc80-067575a9d732-kube-api-access-2fjkc\") pod \"ingress-canary-kd7rp\" (UID: \"91fb67d9-4995-4dfe-bc80-067575a9d732\") " pod="openshift-ingress-canary/ingress-canary-kd7rp" Apr 16 14:53:03.920884 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:03.920571 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/59a9063e-512b-4e8e-9292-b64616eb9a33-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-6b7b574cf-sbk4t\" (UID: \"59a9063e-512b-4e8e-9292-b64616eb9a33\") " 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b7b574cf-sbk4t" Apr 16 14:53:03.920884 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:03.920611 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/59a9063e-512b-4e8e-9292-b64616eb9a33-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-6b7b574cf-sbk4t\" (UID: \"59a9063e-512b-4e8e-9292-b64616eb9a33\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b7b574cf-sbk4t" Apr 16 14:53:03.920884 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:03.920833 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/75f5567a-e94c-421f-8e1b-be0c5c44e10e-trusted-ca\") pod \"image-registry-5d9f76899d-gd279\" (UID: \"75f5567a-e94c-421f-8e1b-be0c5c44e10e\") " pod="openshift-image-registry/image-registry-5d9f76899d-gd279" Apr 16 14:53:03.924972 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:03.924944 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/75f5567a-e94c-421f-8e1b-be0c5c44e10e-installation-pull-secrets\") pod \"image-registry-5d9f76899d-gd279\" (UID: \"75f5567a-e94c-421f-8e1b-be0c5c44e10e\") " pod="openshift-image-registry/image-registry-5d9f76899d-gd279" Apr 16 14:53:03.925126 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:03.925049 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/cabd1204-eddf-42ed-9bad-9f118298b94b-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-77fc8586d5-vtrgt\" (UID: \"cabd1204-eddf-42ed-9bad-9f118298b94b\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-77fc8586d5-vtrgt" Apr 16 14:53:03.928887 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:03.928859 2579 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/75f5567a-e94c-421f-8e1b-be0c5c44e10e-registry-certificates\") pod \"image-registry-5d9f76899d-gd279\" (UID: \"75f5567a-e94c-421f-8e1b-be0c5c44e10e\") " pod="openshift-image-registry/image-registry-5d9f76899d-gd279" Apr 16 14:53:03.930948 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:03.930923 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgkfx\" (UniqueName: \"kubernetes.io/projected/cabd1204-eddf-42ed-9bad-9f118298b94b-kube-api-access-xgkfx\") pod \"managed-serviceaccount-addon-agent-77fc8586d5-vtrgt\" (UID: \"cabd1204-eddf-42ed-9bad-9f118298b94b\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-77fc8586d5-vtrgt" Apr 16 14:53:03.931062 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:03.931019 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/75f5567a-e94c-421f-8e1b-be0c5c44e10e-bound-sa-token\") pod \"image-registry-5d9f76899d-gd279\" (UID: \"75f5567a-e94c-421f-8e1b-be0c5c44e10e\") " pod="openshift-image-registry/image-registry-5d9f76899d-gd279" Apr 16 14:53:03.931129 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:03.931115 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4fv7\" (UniqueName: \"kubernetes.io/projected/75f5567a-e94c-421f-8e1b-be0c5c44e10e-kube-api-access-p4fv7\") pod \"image-registry-5d9f76899d-gd279\" (UID: \"75f5567a-e94c-421f-8e1b-be0c5c44e10e\") " pod="openshift-image-registry/image-registry-5d9f76899d-gd279" Apr 16 14:53:03.934196 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:03.934174 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/75f5567a-e94c-421f-8e1b-be0c5c44e10e-image-registry-private-configuration\") pod 
\"image-registry-5d9f76899d-gd279\" (UID: \"75f5567a-e94c-421f-8e1b-be0c5c44e10e\") " pod="openshift-image-registry/image-registry-5d9f76899d-gd279" Apr 16 14:53:04.021386 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:04.021343 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/545cbae5-e741-4260-bd9d-d9cf35cd5a5a-config-volume\") pod \"dns-default-hfndr\" (UID: \"545cbae5-e741-4260-bd9d-d9cf35cd5a5a\") " pod="openshift-dns/dns-default-hfndr" Apr 16 14:53:04.021386 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:04.021401 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xlznx\" (UniqueName: \"kubernetes.io/projected/29711f9c-de92-459f-9949-54d43d314175-kube-api-access-xlznx\") pod \"klusterlet-addon-workmgr-6fd7f98bbb-69vqc\" (UID: \"29711f9c-de92-459f-9949-54d43d314175\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6fd7f98bbb-69vqc" Apr 16 14:53:04.021648 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:04.021451 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/29711f9c-de92-459f-9949-54d43d314175-tmp\") pod \"klusterlet-addon-workmgr-6fd7f98bbb-69vqc\" (UID: \"29711f9c-de92-459f-9949-54d43d314175\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6fd7f98bbb-69vqc" Apr 16 14:53:04.021648 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:04.021480 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/545cbae5-e741-4260-bd9d-d9cf35cd5a5a-tmp-dir\") pod \"dns-default-hfndr\" (UID: \"545cbae5-e741-4260-bd9d-d9cf35cd5a5a\") " pod="openshift-dns/dns-default-hfndr" Apr 16 14:53:04.021740 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:04.021654 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" 
(UniqueName: \"kubernetes.io/secret/59a9063e-512b-4e8e-9292-b64616eb9a33-ca\") pod \"cluster-proxy-proxy-agent-6b7b574cf-sbk4t\" (UID: \"59a9063e-512b-4e8e-9292-b64616eb9a33\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b7b574cf-sbk4t" Apr 16 14:53:04.021740 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:04.021698 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/59a9063e-512b-4e8e-9292-b64616eb9a33-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-6b7b574cf-sbk4t\" (UID: \"59a9063e-512b-4e8e-9292-b64616eb9a33\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b7b574cf-sbk4t" Apr 16 14:53:04.021740 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:04.021729 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/29711f9c-de92-459f-9949-54d43d314175-klusterlet-config\") pod \"klusterlet-addon-workmgr-6fd7f98bbb-69vqc\" (UID: \"29711f9c-de92-459f-9949-54d43d314175\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6fd7f98bbb-69vqc" Apr 16 14:53:04.021830 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:04.021754 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2fjkc\" (UniqueName: \"kubernetes.io/projected/91fb67d9-4995-4dfe-bc80-067575a9d732-kube-api-access-2fjkc\") pod \"ingress-canary-kd7rp\" (UID: \"91fb67d9-4995-4dfe-bc80-067575a9d732\") " pod="openshift-ingress-canary/ingress-canary-kd7rp" Apr 16 14:53:04.021830 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:04.021781 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/59a9063e-512b-4e8e-9292-b64616eb9a33-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-6b7b574cf-sbk4t\" (UID: \"59a9063e-512b-4e8e-9292-b64616eb9a33\") " 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b7b574cf-sbk4t" Apr 16 14:53:04.021830 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:04.021799 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/545cbae5-e741-4260-bd9d-d9cf35cd5a5a-tmp-dir\") pod \"dns-default-hfndr\" (UID: \"545cbae5-e741-4260-bd9d-d9cf35cd5a5a\") " pod="openshift-dns/dns-default-hfndr" Apr 16 14:53:04.021961 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:04.021850 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/29711f9c-de92-459f-9949-54d43d314175-tmp\") pod \"klusterlet-addon-workmgr-6fd7f98bbb-69vqc\" (UID: \"29711f9c-de92-459f-9949-54d43d314175\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6fd7f98bbb-69vqc" Apr 16 14:53:04.021961 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:04.021926 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/59a9063e-512b-4e8e-9292-b64616eb9a33-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-6b7b574cf-sbk4t\" (UID: \"59a9063e-512b-4e8e-9292-b64616eb9a33\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b7b574cf-sbk4t" Apr 16 14:53:04.021961 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:04.021956 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vsbhc\" (UniqueName: \"kubernetes.io/projected/59a9063e-512b-4e8e-9292-b64616eb9a33-kube-api-access-vsbhc\") pod \"cluster-proxy-proxy-agent-6b7b574cf-sbk4t\" (UID: \"59a9063e-512b-4e8e-9292-b64616eb9a33\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b7b574cf-sbk4t" Apr 16 14:53:04.022071 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:04.021988 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ba7ca911-2481-4fe7-9079-e770b1840406-metrics-certs\") pod \"network-metrics-daemon-vpmk7\" (UID: \"ba7ca911-2481-4fe7-9079-e770b1840406\") " pod="openshift-multus/network-metrics-daemon-vpmk7" Apr 16 14:53:04.022071 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:04.022064 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/59a9063e-512b-4e8e-9292-b64616eb9a33-hub\") pod \"cluster-proxy-proxy-agent-6b7b574cf-sbk4t\" (UID: \"59a9063e-512b-4e8e-9292-b64616eb9a33\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b7b574cf-sbk4t" Apr 16 14:53:04.022169 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:04.022103 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nsnnw\" (UniqueName: \"kubernetes.io/projected/545cbae5-e741-4260-bd9d-d9cf35cd5a5a-kube-api-access-nsnnw\") pod \"dns-default-hfndr\" (UID: \"545cbae5-e741-4260-bd9d-d9cf35cd5a5a\") " pod="openshift-dns/dns-default-hfndr" Apr 16 14:53:04.022169 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:04.022137 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/545cbae5-e741-4260-bd9d-d9cf35cd5a5a-metrics-tls\") pod \"dns-default-hfndr\" (UID: \"545cbae5-e741-4260-bd9d-d9cf35cd5a5a\") " pod="openshift-dns/dns-default-hfndr" Apr 16 14:53:04.022169 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:04.022103 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/545cbae5-e741-4260-bd9d-d9cf35cd5a5a-config-volume\") pod \"dns-default-hfndr\" (UID: \"545cbae5-e741-4260-bd9d-d9cf35cd5a5a\") " pod="openshift-dns/dns-default-hfndr" Apr 16 14:53:04.022316 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:04.022169 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"cert\" (UniqueName: \"kubernetes.io/secret/91fb67d9-4995-4dfe-bc80-067575a9d732-cert\") pod \"ingress-canary-kd7rp\" (UID: \"91fb67d9-4995-4dfe-bc80-067575a9d732\") " pod="openshift-ingress-canary/ingress-canary-kd7rp" Apr 16 14:53:04.022316 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:53:04.022245 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:53:04.022316 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:53:04.022313 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/91fb67d9-4995-4dfe-bc80-067575a9d732-cert podName:91fb67d9-4995-4dfe-bc80-067575a9d732 nodeName:}" failed. No retries permitted until 2026-04-16 14:53:04.522294468 +0000 UTC m=+33.704832230 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/91fb67d9-4995-4dfe-bc80-067575a9d732-cert") pod "ingress-canary-kd7rp" (UID: "91fb67d9-4995-4dfe-bc80-067575a9d732") : secret "canary-serving-cert" not found Apr 16 14:53:04.022464 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:53:04.022364 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:53:04.022464 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:53:04.022413 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ba7ca911-2481-4fe7-9079-e770b1840406-metrics-certs podName:ba7ca911-2481-4fe7-9079-e770b1840406 nodeName:}" failed. No retries permitted until 2026-04-16 14:53:36.022397912 +0000 UTC m=+65.204935669 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ba7ca911-2481-4fe7-9079-e770b1840406-metrics-certs") pod "network-metrics-daemon-vpmk7" (UID: "ba7ca911-2481-4fe7-9079-e770b1840406") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:53:04.022464 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:04.022426 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/59a9063e-512b-4e8e-9292-b64616eb9a33-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-6b7b574cf-sbk4t\" (UID: \"59a9063e-512b-4e8e-9292-b64616eb9a33\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b7b574cf-sbk4t" Apr 16 14:53:04.022602 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:53:04.022477 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 14:53:04.022602 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:53:04.022508 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/545cbae5-e741-4260-bd9d-d9cf35cd5a5a-metrics-tls podName:545cbae5-e741-4260-bd9d-d9cf35cd5a5a nodeName:}" failed. No retries permitted until 2026-04-16 14:53:04.522498699 +0000 UTC m=+33.705036453 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/545cbae5-e741-4260-bd9d-d9cf35cd5a5a-metrics-tls") pod "dns-default-hfndr" (UID: "545cbae5-e741-4260-bd9d-d9cf35cd5a5a") : secret "dns-default-metrics-tls" not found Apr 16 14:53:04.025133 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:04.025103 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/59a9063e-512b-4e8e-9292-b64616eb9a33-ca\") pod \"cluster-proxy-proxy-agent-6b7b574cf-sbk4t\" (UID: \"59a9063e-512b-4e8e-9292-b64616eb9a33\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b7b574cf-sbk4t" Apr 16 14:53:04.025133 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:04.025123 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/29711f9c-de92-459f-9949-54d43d314175-klusterlet-config\") pod \"klusterlet-addon-workmgr-6fd7f98bbb-69vqc\" (UID: \"29711f9c-de92-459f-9949-54d43d314175\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6fd7f98bbb-69vqc" Apr 16 14:53:04.025291 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:04.025240 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/59a9063e-512b-4e8e-9292-b64616eb9a33-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-6b7b574cf-sbk4t\" (UID: \"59a9063e-512b-4e8e-9292-b64616eb9a33\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b7b574cf-sbk4t" Apr 16 14:53:04.025362 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:04.025341 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/59a9063e-512b-4e8e-9292-b64616eb9a33-hub\") pod \"cluster-proxy-proxy-agent-6b7b574cf-sbk4t\" (UID: \"59a9063e-512b-4e8e-9292-b64616eb9a33\") " 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b7b574cf-sbk4t" Apr 16 14:53:04.025417 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:04.025343 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/59a9063e-512b-4e8e-9292-b64616eb9a33-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-6b7b574cf-sbk4t\" (UID: \"59a9063e-512b-4e8e-9292-b64616eb9a33\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b7b574cf-sbk4t" Apr 16 14:53:04.034699 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:04.034643 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nsnnw\" (UniqueName: \"kubernetes.io/projected/545cbae5-e741-4260-bd9d-d9cf35cd5a5a-kube-api-access-nsnnw\") pod \"dns-default-hfndr\" (UID: \"545cbae5-e741-4260-bd9d-d9cf35cd5a5a\") " pod="openshift-dns/dns-default-hfndr" Apr 16 14:53:04.035473 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:04.035423 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlznx\" (UniqueName: \"kubernetes.io/projected/29711f9c-de92-459f-9949-54d43d314175-kube-api-access-xlznx\") pod \"klusterlet-addon-workmgr-6fd7f98bbb-69vqc\" (UID: \"29711f9c-de92-459f-9949-54d43d314175\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6fd7f98bbb-69vqc" Apr 16 14:53:04.036153 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:04.036132 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vsbhc\" (UniqueName: \"kubernetes.io/projected/59a9063e-512b-4e8e-9292-b64616eb9a33-kube-api-access-vsbhc\") pod \"cluster-proxy-proxy-agent-6b7b574cf-sbk4t\" (UID: \"59a9063e-512b-4e8e-9292-b64616eb9a33\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b7b574cf-sbk4t" Apr 16 14:53:04.036552 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:04.036507 2579 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-2fjkc\" (UniqueName: \"kubernetes.io/projected/91fb67d9-4995-4dfe-bc80-067575a9d732-kube-api-access-2fjkc\") pod \"ingress-canary-kd7rp\" (UID: \"91fb67d9-4995-4dfe-bc80-067575a9d732\") " pod="openshift-ingress-canary/ingress-canary-kd7rp" Apr 16 14:53:04.051821 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:04.051781 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-77fc8586d5-vtrgt" Apr 16 14:53:04.083875 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:04.083837 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b7b574cf-sbk4t" Apr 16 14:53:04.117248 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:04.117216 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6fd7f98bbb-69vqc" Apr 16 14:53:04.123147 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:04.123126 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-crm5b\" (UniqueName: \"kubernetes.io/projected/1609e3b0-3601-4e3b-bb3c-b89f06a20319-kube-api-access-crm5b\") pod \"network-check-target-qh75b\" (UID: \"1609e3b0-3601-4e3b-bb3c-b89f06a20319\") " pod="openshift-network-diagnostics/network-check-target-qh75b" Apr 16 14:53:04.123260 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:53:04.123248 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 14:53:04.123304 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:53:04.123266 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 14:53:04.123304 ip-10-0-139-47 
kubenswrapper[2579]: E0416 14:53:04.123275 2579 projected.go:194] Error preparing data for projected volume kube-api-access-crm5b for pod openshift-network-diagnostics/network-check-target-qh75b: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 14:53:04.123374 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:53:04.123327 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1609e3b0-3601-4e3b-bb3c-b89f06a20319-kube-api-access-crm5b podName:1609e3b0-3601-4e3b-bb3c-b89f06a20319 nodeName:}" failed. No retries permitted until 2026-04-16 14:53:36.123314705 +0000 UTC m=+65.305852457 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-crm5b" (UniqueName: "kubernetes.io/projected/1609e3b0-3601-4e3b-bb3c-b89f06a20319-kube-api-access-crm5b") pod "network-check-target-qh75b" (UID: "1609e3b0-3601-4e3b-bb3c-b89f06a20319") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 14:53:04.309132 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:04.309101 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b7b574cf-sbk4t"]
Apr 16 14:53:04.311925 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:04.311885 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-6fd7f98bbb-69vqc"]
Apr 16 14:53:04.312684 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:04.312663 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-77fc8586d5-vtrgt"]
Apr 16 14:53:04.425984 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:04.425939 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/26a2efd2-c6ce-46da-b1fe-6a739a8edd83-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-7st9s\" (UID: \"26a2efd2-c6ce-46da-b1fe-6a739a8edd83\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-7st9s"
Apr 16 14:53:04.426511 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:04.426034 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/75f5567a-e94c-421f-8e1b-be0c5c44e10e-registry-tls\") pod \"image-registry-5d9f76899d-gd279\" (UID: \"75f5567a-e94c-421f-8e1b-be0c5c44e10e\") " pod="openshift-image-registry/image-registry-5d9f76899d-gd279"
Apr 16 14:53:04.426511 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:53:04.426100 2579 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 16 14:53:04.426511 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:53:04.426156 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 14:53:04.426511 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:53:04.426170 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5d9f76899d-gd279: secret "image-registry-tls" not found
Apr 16 14:53:04.426511 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:53:04.426173 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26a2efd2-c6ce-46da-b1fe-6a739a8edd83-networking-console-plugin-cert podName:26a2efd2-c6ce-46da-b1fe-6a739a8edd83 nodeName:}" failed. No retries permitted until 2026-04-16 14:53:05.426153796 +0000 UTC m=+34.608691563 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/26a2efd2-c6ce-46da-b1fe-6a739a8edd83-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-7st9s" (UID: "26a2efd2-c6ce-46da-b1fe-6a739a8edd83") : secret "networking-console-plugin-cert" not found
Apr 16 14:53:04.426511 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:53:04.426280 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/75f5567a-e94c-421f-8e1b-be0c5c44e10e-registry-tls podName:75f5567a-e94c-421f-8e1b-be0c5c44e10e nodeName:}" failed. No retries permitted until 2026-04-16 14:53:05.426264172 +0000 UTC m=+34.608801945 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/75f5567a-e94c-421f-8e1b-be0c5c44e10e-registry-tls") pod "image-registry-5d9f76899d-gd279" (UID: "75f5567a-e94c-421f-8e1b-be0c5c44e10e") : secret "image-registry-tls" not found
Apr 16 14:53:04.435102 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:53:04.435067 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod59a9063e_512b_4e8e_9292_b64616eb9a33.slice/crio-b6a9848fa55752db64a66bf09142cdb066856a762fe73d56f329b3eda69c2d4f WatchSource:0}: Error finding container b6a9848fa55752db64a66bf09142cdb066856a762fe73d56f329b3eda69c2d4f: Status 404 returned error can't find the container with id b6a9848fa55752db64a66bf09142cdb066856a762fe73d56f329b3eda69c2d4f
Apr 16 14:53:04.435341 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:53:04.435322 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29711f9c_de92_459f_9949_54d43d314175.slice/crio-80a90517239ddadbe9913cdab96c1c41dcc2627c509e4fec473588da1f5a62e9 WatchSource:0}: Error finding container 80a90517239ddadbe9913cdab96c1c41dcc2627c509e4fec473588da1f5a62e9: Status 404 returned error can't find the container with id 80a90517239ddadbe9913cdab96c1c41dcc2627c509e4fec473588da1f5a62e9
Apr 16 14:53:04.436001 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:53:04.435955 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcabd1204_eddf_42ed_9bad_9f118298b94b.slice/crio-8b9e335126d6a60405262eb11ba1a86ee117009a811669055745523a665872ba WatchSource:0}: Error finding container 8b9e335126d6a60405262eb11ba1a86ee117009a811669055745523a665872ba: Status 404 returned error can't find the container with id 8b9e335126d6a60405262eb11ba1a86ee117009a811669055745523a665872ba
Apr 16 14:53:04.527386 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:04.527355 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/545cbae5-e741-4260-bd9d-d9cf35cd5a5a-metrics-tls\") pod \"dns-default-hfndr\" (UID: \"545cbae5-e741-4260-bd9d-d9cf35cd5a5a\") " pod="openshift-dns/dns-default-hfndr"
Apr 16 14:53:04.527386 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:04.527392 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/91fb67d9-4995-4dfe-bc80-067575a9d732-cert\") pod \"ingress-canary-kd7rp\" (UID: \"91fb67d9-4995-4dfe-bc80-067575a9d732\") " pod="openshift-ingress-canary/ingress-canary-kd7rp"
Apr 16 14:53:04.527593 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:53:04.527519 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 14:53:04.527593 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:53:04.527521 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 14:53:04.527593 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:53:04.527587 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/91fb67d9-4995-4dfe-bc80-067575a9d732-cert podName:91fb67d9-4995-4dfe-bc80-067575a9d732 nodeName:}" failed. No retries permitted until 2026-04-16 14:53:05.527566008 +0000 UTC m=+34.710103782 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/91fb67d9-4995-4dfe-bc80-067575a9d732-cert") pod "ingress-canary-kd7rp" (UID: "91fb67d9-4995-4dfe-bc80-067575a9d732") : secret "canary-serving-cert" not found
Apr 16 14:53:04.527710 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:53:04.527606 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/545cbae5-e741-4260-bd9d-d9cf35cd5a5a-metrics-tls podName:545cbae5-e741-4260-bd9d-d9cf35cd5a5a nodeName:}" failed. No retries permitted until 2026-04-16 14:53:05.527597114 +0000 UTC m=+34.710134866 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/545cbae5-e741-4260-bd9d-d9cf35cd5a5a-metrics-tls") pod "dns-default-hfndr" (UID: "545cbae5-e741-4260-bd9d-d9cf35cd5a5a") : secret "dns-default-metrics-tls" not found
Apr 16 14:53:04.592934 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:04.592880 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6fd7f98bbb-69vqc" event={"ID":"29711f9c-de92-459f-9949-54d43d314175","Type":"ContainerStarted","Data":"80a90517239ddadbe9913cdab96c1c41dcc2627c509e4fec473588da1f5a62e9"}
Apr 16 14:53:04.593728 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:04.593705 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-77fc8586d5-vtrgt" event={"ID":"cabd1204-eddf-42ed-9bad-9f118298b94b","Type":"ContainerStarted","Data":"8b9e335126d6a60405262eb11ba1a86ee117009a811669055745523a665872ba"}
Apr 16 14:53:04.594487 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:04.594471 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b7b574cf-sbk4t" event={"ID":"59a9063e-512b-4e8e-9292-b64616eb9a33","Type":"ContainerStarted","Data":"b6a9848fa55752db64a66bf09142cdb066856a762fe73d56f329b3eda69c2d4f"}
Apr 16 14:53:05.353371 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:05.353336 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vpmk7"
Apr 16 14:53:05.353797 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:05.353780 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qh75b"
Apr 16 14:53:05.354522 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:05.354498 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-m9n7j"
Apr 16 14:53:05.357939 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:05.357915 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 16 14:53:05.358200 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:05.358184 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 16 14:53:05.359424 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:05.359399 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 16 14:53:05.359630 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:05.359614 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-z9txl\""
Apr 16 14:53:05.359824 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:05.359792 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-lnt9d\""
Apr 16 14:53:05.360032 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:05.360008 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 16 14:53:05.437372 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:05.437337 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/26a2efd2-c6ce-46da-b1fe-6a739a8edd83-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-7st9s\" (UID: \"26a2efd2-c6ce-46da-b1fe-6a739a8edd83\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-7st9s"
Apr 16 14:53:05.438216 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:05.437441 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/75f5567a-e94c-421f-8e1b-be0c5c44e10e-registry-tls\") pod \"image-registry-5d9f76899d-gd279\" (UID: \"75f5567a-e94c-421f-8e1b-be0c5c44e10e\") " pod="openshift-image-registry/image-registry-5d9f76899d-gd279"
Apr 16 14:53:05.438216 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:53:05.437519 2579 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 16 14:53:05.438216 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:53:05.437559 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 14:53:05.438216 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:53:05.437573 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5d9f76899d-gd279: secret "image-registry-tls" not found
Apr 16 14:53:05.438216 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:53:05.437597 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26a2efd2-c6ce-46da-b1fe-6a739a8edd83-networking-console-plugin-cert podName:26a2efd2-c6ce-46da-b1fe-6a739a8edd83 nodeName:}" failed. No retries permitted until 2026-04-16 14:53:07.437571798 +0000 UTC m=+36.620109564 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/26a2efd2-c6ce-46da-b1fe-6a739a8edd83-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-7st9s" (UID: "26a2efd2-c6ce-46da-b1fe-6a739a8edd83") : secret "networking-console-plugin-cert" not found
Apr 16 14:53:05.438216 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:53:05.437621 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/75f5567a-e94c-421f-8e1b-be0c5c44e10e-registry-tls podName:75f5567a-e94c-421f-8e1b-be0c5c44e10e nodeName:}" failed. No retries permitted until 2026-04-16 14:53:07.437605486 +0000 UTC m=+36.620143241 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/75f5567a-e94c-421f-8e1b-be0c5c44e10e-registry-tls") pod "image-registry-5d9f76899d-gd279" (UID: "75f5567a-e94c-421f-8e1b-be0c5c44e10e") : secret "image-registry-tls" not found
Apr 16 14:53:05.538604 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:05.538559 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/545cbae5-e741-4260-bd9d-d9cf35cd5a5a-metrics-tls\") pod \"dns-default-hfndr\" (UID: \"545cbae5-e741-4260-bd9d-d9cf35cd5a5a\") " pod="openshift-dns/dns-default-hfndr"
Apr 16 14:53:05.538604 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:05.538607 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/91fb67d9-4995-4dfe-bc80-067575a9d732-cert\") pod \"ingress-canary-kd7rp\" (UID: \"91fb67d9-4995-4dfe-bc80-067575a9d732\") " pod="openshift-ingress-canary/ingress-canary-kd7rp"
Apr 16 14:53:05.538942 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:53:05.538748 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 14:53:05.538942 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:53:05.538807 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/91fb67d9-4995-4dfe-bc80-067575a9d732-cert podName:91fb67d9-4995-4dfe-bc80-067575a9d732 nodeName:}" failed. No retries permitted until 2026-04-16 14:53:07.53878915 +0000 UTC m=+36.721326906 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/91fb67d9-4995-4dfe-bc80-067575a9d732-cert") pod "ingress-canary-kd7rp" (UID: "91fb67d9-4995-4dfe-bc80-067575a9d732") : secret "canary-serving-cert" not found
Apr 16 14:53:05.539520 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:53:05.539499 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 14:53:05.539599 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:53:05.539552 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/545cbae5-e741-4260-bd9d-d9cf35cd5a5a-metrics-tls podName:545cbae5-e741-4260-bd9d-d9cf35cd5a5a nodeName:}" failed. No retries permitted until 2026-04-16 14:53:07.539537332 +0000 UTC m=+36.722075089 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/545cbae5-e741-4260-bd9d-d9cf35cd5a5a-metrics-tls") pod "dns-default-hfndr" (UID: "545cbae5-e741-4260-bd9d-d9cf35cd5a5a") : secret "dns-default-metrics-tls" not found
Apr 16 14:53:05.610511 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:05.608940 2579 generic.go:358] "Generic (PLEG): container finished" podID="b9885cab-5589-4597-9b6e-6ef20f7bc2b2" containerID="4fc0aa0e44bbc11a296f4044a46371e5208db6f700ed5ddc90f096d96865721c" exitCode=0
Apr 16 14:53:05.610511 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:05.609003 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dtvll" event={"ID":"b9885cab-5589-4597-9b6e-6ef20f7bc2b2","Type":"ContainerDied","Data":"4fc0aa0e44bbc11a296f4044a46371e5208db6f700ed5ddc90f096d96865721c"}
Apr 16 14:53:06.616125 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:06.616083 2579 generic.go:358] "Generic (PLEG): container finished" podID="b9885cab-5589-4597-9b6e-6ef20f7bc2b2" containerID="67ea1769ae0514d9ebaf0f620bb32e3edac4658d5057fb39ead8b80120b73207" exitCode=0
Apr 16 14:53:06.616649 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:06.616134 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dtvll" event={"ID":"b9885cab-5589-4597-9b6e-6ef20f7bc2b2","Type":"ContainerDied","Data":"67ea1769ae0514d9ebaf0f620bb32e3edac4658d5057fb39ead8b80120b73207"}
Apr 16 14:53:07.457834 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:07.457778 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/26a2efd2-c6ce-46da-b1fe-6a739a8edd83-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-7st9s\" (UID: \"26a2efd2-c6ce-46da-b1fe-6a739a8edd83\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-7st9s"
Apr 16 14:53:07.458073 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:07.457936 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/75f5567a-e94c-421f-8e1b-be0c5c44e10e-registry-tls\") pod \"image-registry-5d9f76899d-gd279\" (UID: \"75f5567a-e94c-421f-8e1b-be0c5c44e10e\") " pod="openshift-image-registry/image-registry-5d9f76899d-gd279"
Apr 16 14:53:07.458073 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:53:07.457977 2579 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 16 14:53:07.458073 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:53:07.458063 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26a2efd2-c6ce-46da-b1fe-6a739a8edd83-networking-console-plugin-cert podName:26a2efd2-c6ce-46da-b1fe-6a739a8edd83 nodeName:}" failed. No retries permitted until 2026-04-16 14:53:11.458040923 +0000 UTC m=+40.640578699 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/26a2efd2-c6ce-46da-b1fe-6a739a8edd83-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-7st9s" (UID: "26a2efd2-c6ce-46da-b1fe-6a739a8edd83") : secret "networking-console-plugin-cert" not found
Apr 16 14:53:07.458254 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:53:07.458133 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 14:53:07.458254 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:53:07.458148 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5d9f76899d-gd279: secret "image-registry-tls" not found
Apr 16 14:53:07.458254 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:53:07.458204 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/75f5567a-e94c-421f-8e1b-be0c5c44e10e-registry-tls podName:75f5567a-e94c-421f-8e1b-be0c5c44e10e nodeName:}" failed. No retries permitted until 2026-04-16 14:53:11.458185287 +0000 UTC m=+40.640723064 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/75f5567a-e94c-421f-8e1b-be0c5c44e10e-registry-tls") pod "image-registry-5d9f76899d-gd279" (UID: "75f5567a-e94c-421f-8e1b-be0c5c44e10e") : secret "image-registry-tls" not found
Apr 16 14:53:07.558710 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:07.558662 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/545cbae5-e741-4260-bd9d-d9cf35cd5a5a-metrics-tls\") pod \"dns-default-hfndr\" (UID: \"545cbae5-e741-4260-bd9d-d9cf35cd5a5a\") " pod="openshift-dns/dns-default-hfndr"
Apr 16 14:53:07.558710 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:07.558715 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/91fb67d9-4995-4dfe-bc80-067575a9d732-cert\") pod \"ingress-canary-kd7rp\" (UID: \"91fb67d9-4995-4dfe-bc80-067575a9d732\") " pod="openshift-ingress-canary/ingress-canary-kd7rp"
Apr 16 14:53:07.558983 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:53:07.558826 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 14:53:07.558983 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:53:07.558895 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 14:53:07.558983 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:53:07.558923 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/545cbae5-e741-4260-bd9d-d9cf35cd5a5a-metrics-tls podName:545cbae5-e741-4260-bd9d-d9cf35cd5a5a nodeName:}" failed. No retries permitted until 2026-04-16 14:53:11.558886089 +0000 UTC m=+40.741423843 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/545cbae5-e741-4260-bd9d-d9cf35cd5a5a-metrics-tls") pod "dns-default-hfndr" (UID: "545cbae5-e741-4260-bd9d-d9cf35cd5a5a") : secret "dns-default-metrics-tls" not found
Apr 16 14:53:07.558983 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:53:07.558979 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/91fb67d9-4995-4dfe-bc80-067575a9d732-cert podName:91fb67d9-4995-4dfe-bc80-067575a9d732 nodeName:}" failed. No retries permitted until 2026-04-16 14:53:11.558960992 +0000 UTC m=+40.741498762 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/91fb67d9-4995-4dfe-bc80-067575a9d732-cert") pod "ingress-canary-kd7rp" (UID: "91fb67d9-4995-4dfe-bc80-067575a9d732") : secret "canary-serving-cert" not found
Apr 16 14:53:11.491985 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:11.491755 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/26a2efd2-c6ce-46da-b1fe-6a739a8edd83-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-7st9s\" (UID: \"26a2efd2-c6ce-46da-b1fe-6a739a8edd83\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-7st9s"
Apr 16 14:53:11.492438 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:53:11.491955 2579 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 16 14:53:11.492438 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:11.492054 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/75f5567a-e94c-421f-8e1b-be0c5c44e10e-registry-tls\") pod \"image-registry-5d9f76899d-gd279\" (UID: \"75f5567a-e94c-421f-8e1b-be0c5c44e10e\") " pod="openshift-image-registry/image-registry-5d9f76899d-gd279"
Apr 16 14:53:11.492438 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:53:11.492127 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26a2efd2-c6ce-46da-b1fe-6a739a8edd83-networking-console-plugin-cert podName:26a2efd2-c6ce-46da-b1fe-6a739a8edd83 nodeName:}" failed. No retries permitted until 2026-04-16 14:53:19.492109052 +0000 UTC m=+48.674646808 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/26a2efd2-c6ce-46da-b1fe-6a739a8edd83-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-7st9s" (UID: "26a2efd2-c6ce-46da-b1fe-6a739a8edd83") : secret "networking-console-plugin-cert" not found
Apr 16 14:53:11.492438 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:53:11.492188 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 14:53:11.492438 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:53:11.492203 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5d9f76899d-gd279: secret "image-registry-tls" not found
Apr 16 14:53:11.492438 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:53:11.492260 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/75f5567a-e94c-421f-8e1b-be0c5c44e10e-registry-tls podName:75f5567a-e94c-421f-8e1b-be0c5c44e10e nodeName:}" failed. No retries permitted until 2026-04-16 14:53:19.492242684 +0000 UTC m=+48.674780438 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/75f5567a-e94c-421f-8e1b-be0c5c44e10e-registry-tls") pod "image-registry-5d9f76899d-gd279" (UID: "75f5567a-e94c-421f-8e1b-be0c5c44e10e") : secret "image-registry-tls" not found
Apr 16 14:53:11.592939 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:11.592888 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/545cbae5-e741-4260-bd9d-d9cf35cd5a5a-metrics-tls\") pod \"dns-default-hfndr\" (UID: \"545cbae5-e741-4260-bd9d-d9cf35cd5a5a\") " pod="openshift-dns/dns-default-hfndr"
Apr 16 14:53:11.593140 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:11.592955 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/91fb67d9-4995-4dfe-bc80-067575a9d732-cert\") pod \"ingress-canary-kd7rp\" (UID: \"91fb67d9-4995-4dfe-bc80-067575a9d732\") " pod="openshift-ingress-canary/ingress-canary-kd7rp"
Apr 16 14:53:11.593140 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:53:11.593034 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 14:53:11.593140 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:53:11.593074 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 14:53:11.593140 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:53:11.593115 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/545cbae5-e741-4260-bd9d-d9cf35cd5a5a-metrics-tls podName:545cbae5-e741-4260-bd9d-d9cf35cd5a5a nodeName:}" failed. No retries permitted until 2026-04-16 14:53:19.593092755 +0000 UTC m=+48.775630529 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/545cbae5-e741-4260-bd9d-d9cf35cd5a5a-metrics-tls") pod "dns-default-hfndr" (UID: "545cbae5-e741-4260-bd9d-d9cf35cd5a5a") : secret "dns-default-metrics-tls" not found
Apr 16 14:53:11.593140 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:53:11.593134 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/91fb67d9-4995-4dfe-bc80-067575a9d732-cert podName:91fb67d9-4995-4dfe-bc80-067575a9d732 nodeName:}" failed. No retries permitted until 2026-04-16 14:53:19.593125072 +0000 UTC m=+48.775662829 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/91fb67d9-4995-4dfe-bc80-067575a9d732-cert") pod "ingress-canary-kd7rp" (UID: "91fb67d9-4995-4dfe-bc80-067575a9d732") : secret "canary-serving-cert" not found
Apr 16 14:53:11.628782 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:11.628746 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b7b574cf-sbk4t" event={"ID":"59a9063e-512b-4e8e-9292-b64616eb9a33","Type":"ContainerStarted","Data":"f584988c6d7230e465467f22c70ef9c6e67186db7b73abe22821d3280ce70d61"}
Apr 16 14:53:11.632050 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:11.632004 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dtvll" event={"ID":"b9885cab-5589-4597-9b6e-6ef20f7bc2b2","Type":"ContainerStarted","Data":"6dff08cb43360abfeb81ea20ae26665ca4bd5ea67c414380ce77ef2e0c54a161"}
Apr 16 14:53:11.633637 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:11.633608 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6fd7f98bbb-69vqc" event={"ID":"29711f9c-de92-459f-9949-54d43d314175","Type":"ContainerStarted","Data":"04c4342665c366459e04bb62fd9375f1e5af8a125b4789efedb97287ab8e27ef"}
Apr 16 14:53:11.633834 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:11.633805 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6fd7f98bbb-69vqc"
Apr 16 14:53:11.635063 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:11.635023 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-77fc8586d5-vtrgt" event={"ID":"cabd1204-eddf-42ed-9bad-9f118298b94b","Type":"ContainerStarted","Data":"0b9ea46515a38cf8108003f29db3450fe9b3a0b2b63481afb2e560dbc597fd60"}
Apr 16 14:53:11.635763 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:11.635743 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6fd7f98bbb-69vqc"
Apr 16 14:53:11.657003 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:11.656955 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-dtvll" podStartSLOduration=8.778778587 podStartE2EDuration="40.656938399s" podCreationTimestamp="2026-04-16 14:52:31 +0000 UTC" firstStartedPulling="2026-04-16 14:52:32.601716006 +0000 UTC m=+1.784253757" lastFinishedPulling="2026-04-16 14:53:04.479875797 +0000 UTC m=+33.662413569" observedRunningTime="2026-04-16 14:53:11.655778431 +0000 UTC m=+40.838316205" watchObservedRunningTime="2026-04-16 14:53:11.656938399 +0000 UTC m=+40.839476172"
Apr 16 14:53:11.671834 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:11.671771 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6fd7f98bbb-69vqc" podStartSLOduration=6.564893215 podStartE2EDuration="12.671755662s" podCreationTimestamp="2026-04-16 14:52:59 +0000 UTC" firstStartedPulling="2026-04-16 14:53:04.457106946 +0000 UTC m=+33.639644761" lastFinishedPulling="2026-04-16 14:53:10.563969452 +0000 UTC m=+39.746507208" observedRunningTime="2026-04-16 14:53:11.671004694 +0000 UTC m=+40.853542469" watchObservedRunningTime="2026-04-16 14:53:11.671755662 +0000 UTC m=+40.854293436"
Apr 16 14:53:11.687784 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:11.687725 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-77fc8586d5-vtrgt" podStartSLOduration=6.575394856 podStartE2EDuration="12.687705497s" podCreationTimestamp="2026-04-16 14:52:59 +0000 UTC" firstStartedPulling="2026-04-16 14:53:04.457081241 +0000 UTC m=+33.639618993" lastFinishedPulling="2026-04-16 14:53:10.569391866 +0000 UTC m=+39.751929634" observedRunningTime="2026-04-16 14:53:11.687153227 +0000 UTC m=+40.869691014" watchObservedRunningTime="2026-04-16 14:53:11.687705497 +0000 UTC m=+40.870243272"
Apr 16 14:53:13.640262 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:13.640219 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b7b574cf-sbk4t" event={"ID":"59a9063e-512b-4e8e-9292-b64616eb9a33","Type":"ContainerStarted","Data":"19f427fe79e3ddd04f2c8d6e1502d5058e17ded6839a52f772bcf44996b23ed8"}
Apr 16 14:53:13.640613 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:13.640267 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b7b574cf-sbk4t" event={"ID":"59a9063e-512b-4e8e-9292-b64616eb9a33","Type":"ContainerStarted","Data":"1888d23f1693a45ea509067d1b2ec2be56a8a4098bb68c687e3020e58923be59"}
Apr 16 14:53:13.666012 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:13.665966 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b7b574cf-sbk4t" podStartSLOduration=6.205226863 podStartE2EDuration="14.665952535s" podCreationTimestamp="2026-04-16 14:52:59 +0000 UTC" firstStartedPulling="2026-04-16 14:53:04.456954328 +0000 UTC m=+33.639492082" lastFinishedPulling="2026-04-16 14:53:12.917679999 +0000 UTC m=+42.100217754" observedRunningTime="2026-04-16 14:53:13.664281497 +0000 UTC m=+42.846819272" watchObservedRunningTime="2026-04-16 14:53:13.665952535 +0000 UTC m=+42.848490309"
Apr 16 14:53:14.012553 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:14.012523 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/aee05764-3834-4ece-aff3-6c244c99378a-original-pull-secret\") pod \"global-pull-secret-syncer-m9n7j\" (UID: \"aee05764-3834-4ece-aff3-6c244c99378a\") " pod="kube-system/global-pull-secret-syncer-m9n7j"
Apr 16 14:53:14.015929 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:14.015894 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/aee05764-3834-4ece-aff3-6c244c99378a-original-pull-secret\") pod \"global-pull-secret-syncer-m9n7j\" (UID: \"aee05764-3834-4ece-aff3-6c244c99378a\") " pod="kube-system/global-pull-secret-syncer-m9n7j"
Apr 16 14:53:14.094402 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:14.094364 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-m9n7j"
Apr 16 14:53:14.207241 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:14.207210 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-m9n7j"]
Apr 16 14:53:14.211083 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:53:14.211053 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaee05764_3834_4ece_aff3_6c244c99378a.slice/crio-5a942ebf642049cf61b58ad40ea96f479ad26258c63e58ceb8f0eb7d0b05cc5b WatchSource:0}: Error finding container 5a942ebf642049cf61b58ad40ea96f479ad26258c63e58ceb8f0eb7d0b05cc5b: Status 404 returned error can't find the container with id 5a942ebf642049cf61b58ad40ea96f479ad26258c63e58ceb8f0eb7d0b05cc5b
Apr 16 14:53:14.643565 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:14.643527 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-m9n7j" event={"ID":"aee05764-3834-4ece-aff3-6c244c99378a","Type":"ContainerStarted","Data":"5a942ebf642049cf61b58ad40ea96f479ad26258c63e58ceb8f0eb7d0b05cc5b"}
Apr 16 14:53:18.654541 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:18.654506 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-m9n7j" event={"ID":"aee05764-3834-4ece-aff3-6c244c99378a","Type":"ContainerStarted","Data":"dceabeda122014b6080ad7572076adc535aebf4e9da53e1b0899920a8d7a1b25"}
Apr 16 14:53:19.551102 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:19.551060 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/75f5567a-e94c-421f-8e1b-be0c5c44e10e-registry-tls\") pod \"image-registry-5d9f76899d-gd279\" (UID: \"75f5567a-e94c-421f-8e1b-be0c5c44e10e\") " pod="openshift-image-registry/image-registry-5d9f76899d-gd279"
Apr 16 14:53:19.551279 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:19.551138
2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/26a2efd2-c6ce-46da-b1fe-6a739a8edd83-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-7st9s\" (UID: \"26a2efd2-c6ce-46da-b1fe-6a739a8edd83\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-7st9s" Apr 16 14:53:19.551279 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:53:19.551195 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 14:53:19.551279 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:53:19.551213 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5d9f76899d-gd279: secret "image-registry-tls" not found Apr 16 14:53:19.551279 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:53:19.551246 2579 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 14:53:19.551279 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:53:19.551268 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/75f5567a-e94c-421f-8e1b-be0c5c44e10e-registry-tls podName:75f5567a-e94c-421f-8e1b-be0c5c44e10e nodeName:}" failed. No retries permitted until 2026-04-16 14:53:35.551251978 +0000 UTC m=+64.733789730 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/75f5567a-e94c-421f-8e1b-be0c5c44e10e-registry-tls") pod "image-registry-5d9f76899d-gd279" (UID: "75f5567a-e94c-421f-8e1b-be0c5c44e10e") : secret "image-registry-tls" not found Apr 16 14:53:19.551440 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:53:19.551289 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26a2efd2-c6ce-46da-b1fe-6a739a8edd83-networking-console-plugin-cert podName:26a2efd2-c6ce-46da-b1fe-6a739a8edd83 nodeName:}" failed. No retries permitted until 2026-04-16 14:53:35.551277811 +0000 UTC m=+64.733815562 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/26a2efd2-c6ce-46da-b1fe-6a739a8edd83-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-7st9s" (UID: "26a2efd2-c6ce-46da-b1fe-6a739a8edd83") : secret "networking-console-plugin-cert" not found Apr 16 14:53:19.652041 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:19.652009 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/545cbae5-e741-4260-bd9d-d9cf35cd5a5a-metrics-tls\") pod \"dns-default-hfndr\" (UID: \"545cbae5-e741-4260-bd9d-d9cf35cd5a5a\") " pod="openshift-dns/dns-default-hfndr" Apr 16 14:53:19.652041 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:19.652045 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/91fb67d9-4995-4dfe-bc80-067575a9d732-cert\") pod \"ingress-canary-kd7rp\" (UID: \"91fb67d9-4995-4dfe-bc80-067575a9d732\") " pod="openshift-ingress-canary/ingress-canary-kd7rp" Apr 16 14:53:19.652219 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:53:19.652154 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 14:53:19.652219 
ip-10-0-139-47 kubenswrapper[2579]: E0416 14:53:19.652198 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:53:19.652284 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:53:19.652223 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/545cbae5-e741-4260-bd9d-d9cf35cd5a5a-metrics-tls podName:545cbae5-e741-4260-bd9d-d9cf35cd5a5a nodeName:}" failed. No retries permitted until 2026-04-16 14:53:35.652201728 +0000 UTC m=+64.834739494 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/545cbae5-e741-4260-bd9d-d9cf35cd5a5a-metrics-tls") pod "dns-default-hfndr" (UID: "545cbae5-e741-4260-bd9d-d9cf35cd5a5a") : secret "dns-default-metrics-tls" not found Apr 16 14:53:19.652321 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:53:19.652315 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/91fb67d9-4995-4dfe-bc80-067575a9d732-cert podName:91fb67d9-4995-4dfe-bc80-067575a9d732 nodeName:}" failed. No retries permitted until 2026-04-16 14:53:35.652302112 +0000 UTC m=+64.834839864 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/91fb67d9-4995-4dfe-bc80-067575a9d732-cert") pod "ingress-canary-kd7rp" (UID: "91fb67d9-4995-4dfe-bc80-067575a9d732") : secret "canary-serving-cert" not found Apr 16 14:53:28.591336 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:28.591310 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wrxcw" Apr 16 14:53:28.615570 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:28.615531 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-m9n7j" podStartSLOduration=42.62585253 podStartE2EDuration="46.615518159s" podCreationTimestamp="2026-04-16 14:52:42 +0000 UTC" firstStartedPulling="2026-04-16 14:53:14.21275558 +0000 UTC m=+43.395293333" lastFinishedPulling="2026-04-16 14:53:18.20242121 +0000 UTC m=+47.384958962" observedRunningTime="2026-04-16 14:53:18.668729392 +0000 UTC m=+47.851267166" watchObservedRunningTime="2026-04-16 14:53:28.615518159 +0000 UTC m=+57.798055933" Apr 16 14:53:35.575223 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:35.575176 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/26a2efd2-c6ce-46da-b1fe-6a739a8edd83-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-7st9s\" (UID: \"26a2efd2-c6ce-46da-b1fe-6a739a8edd83\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-7st9s" Apr 16 14:53:35.575730 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:35.575250 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/75f5567a-e94c-421f-8e1b-be0c5c44e10e-registry-tls\") pod \"image-registry-5d9f76899d-gd279\" (UID: \"75f5567a-e94c-421f-8e1b-be0c5c44e10e\") " pod="openshift-image-registry/image-registry-5d9f76899d-gd279" Apr 16 
14:53:35.575730 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:53:35.575350 2579 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 14:53:35.575730 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:53:35.575353 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 14:53:35.575730 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:53:35.575460 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5d9f76899d-gd279: secret "image-registry-tls" not found Apr 16 14:53:35.575730 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:53:35.575443 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26a2efd2-c6ce-46da-b1fe-6a739a8edd83-networking-console-plugin-cert podName:26a2efd2-c6ce-46da-b1fe-6a739a8edd83 nodeName:}" failed. No retries permitted until 2026-04-16 14:54:07.575422576 +0000 UTC m=+96.757960331 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/26a2efd2-c6ce-46da-b1fe-6a739a8edd83-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-7st9s" (UID: "26a2efd2-c6ce-46da-b1fe-6a739a8edd83") : secret "networking-console-plugin-cert" not found Apr 16 14:53:35.575730 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:53:35.575517 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/75f5567a-e94c-421f-8e1b-be0c5c44e10e-registry-tls podName:75f5567a-e94c-421f-8e1b-be0c5c44e10e nodeName:}" failed. No retries permitted until 2026-04-16 14:54:07.575504246 +0000 UTC m=+96.758041998 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/75f5567a-e94c-421f-8e1b-be0c5c44e10e-registry-tls") pod "image-registry-5d9f76899d-gd279" (UID: "75f5567a-e94c-421f-8e1b-be0c5c44e10e") : secret "image-registry-tls" not found Apr 16 14:53:35.675809 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:35.675779 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/91fb67d9-4995-4dfe-bc80-067575a9d732-cert\") pod \"ingress-canary-kd7rp\" (UID: \"91fb67d9-4995-4dfe-bc80-067575a9d732\") " pod="openshift-ingress-canary/ingress-canary-kd7rp" Apr 16 14:53:35.675956 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:35.675944 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/545cbae5-e741-4260-bd9d-d9cf35cd5a5a-metrics-tls\") pod \"dns-default-hfndr\" (UID: \"545cbae5-e741-4260-bd9d-d9cf35cd5a5a\") " pod="openshift-dns/dns-default-hfndr" Apr 16 14:53:35.675956 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:53:35.675947 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:53:35.676041 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:53:35.676002 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/91fb67d9-4995-4dfe-bc80-067575a9d732-cert podName:91fb67d9-4995-4dfe-bc80-067575a9d732 nodeName:}" failed. No retries permitted until 2026-04-16 14:54:07.67598728 +0000 UTC m=+96.858525031 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/91fb67d9-4995-4dfe-bc80-067575a9d732-cert") pod "ingress-canary-kd7rp" (UID: "91fb67d9-4995-4dfe-bc80-067575a9d732") : secret "canary-serving-cert" not found Apr 16 14:53:35.676041 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:53:35.676019 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 14:53:35.676114 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:53:35.676059 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/545cbae5-e741-4260-bd9d-d9cf35cd5a5a-metrics-tls podName:545cbae5-e741-4260-bd9d-d9cf35cd5a5a nodeName:}" failed. No retries permitted until 2026-04-16 14:54:07.676048284 +0000 UTC m=+96.858586036 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/545cbae5-e741-4260-bd9d-d9cf35cd5a5a-metrics-tls") pod "dns-default-hfndr" (UID: "545cbae5-e741-4260-bd9d-d9cf35cd5a5a") : secret "dns-default-metrics-tls" not found Apr 16 14:53:36.079606 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:36.079567 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ba7ca911-2481-4fe7-9079-e770b1840406-metrics-certs\") pod \"network-metrics-daemon-vpmk7\" (UID: \"ba7ca911-2481-4fe7-9079-e770b1840406\") " pod="openshift-multus/network-metrics-daemon-vpmk7" Apr 16 14:53:36.082094 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:36.082072 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 14:53:36.090350 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:53:36.090317 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 14:53:36.090442 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:53:36.090395 2579 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ba7ca911-2481-4fe7-9079-e770b1840406-metrics-certs podName:ba7ca911-2481-4fe7-9079-e770b1840406 nodeName:}" failed. No retries permitted until 2026-04-16 14:54:40.09037651 +0000 UTC m=+129.272914262 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ba7ca911-2481-4fe7-9079-e770b1840406-metrics-certs") pod "network-metrics-daemon-vpmk7" (UID: "ba7ca911-2481-4fe7-9079-e770b1840406") : secret "metrics-daemon-secret" not found Apr 16 14:53:36.180255 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:36.180154 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-crm5b\" (UniqueName: \"kubernetes.io/projected/1609e3b0-3601-4e3b-bb3c-b89f06a20319-kube-api-access-crm5b\") pod \"network-check-target-qh75b\" (UID: \"1609e3b0-3601-4e3b-bb3c-b89f06a20319\") " pod="openshift-network-diagnostics/network-check-target-qh75b" Apr 16 14:53:36.182553 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:36.182534 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 14:53:36.193370 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:36.193354 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 14:53:36.204564 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:36.204544 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-crm5b\" (UniqueName: \"kubernetes.io/projected/1609e3b0-3601-4e3b-bb3c-b89f06a20319-kube-api-access-crm5b\") pod \"network-check-target-qh75b\" (UID: \"1609e3b0-3601-4e3b-bb3c-b89f06a20319\") " pod="openshift-network-diagnostics/network-check-target-qh75b" Apr 16 14:53:36.285083 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:36.285050 2579 reflector.go:430] "Caches 
populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-z9txl\"" Apr 16 14:53:36.293014 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:36.292989 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qh75b" Apr 16 14:53:36.404840 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:36.404817 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-qh75b"] Apr 16 14:53:36.407235 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:53:36.407204 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1609e3b0_3601_4e3b_bb3c_b89f06a20319.slice/crio-3f1590a6c62b768f2dc101e3e24c736fba158f2c1bdeddda41d7d1085a4c957f WatchSource:0}: Error finding container 3f1590a6c62b768f2dc101e3e24c736fba158f2c1bdeddda41d7d1085a4c957f: Status 404 returned error can't find the container with id 3f1590a6c62b768f2dc101e3e24c736fba158f2c1bdeddda41d7d1085a4c957f Apr 16 14:53:36.700782 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:36.700686 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-qh75b" event={"ID":"1609e3b0-3601-4e3b-bb3c-b89f06a20319","Type":"ContainerStarted","Data":"3f1590a6c62b768f2dc101e3e24c736fba158f2c1bdeddda41d7d1085a4c957f"} Apr 16 14:53:39.710022 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:39.709982 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-qh75b" event={"ID":"1609e3b0-3601-4e3b-bb3c-b89f06a20319","Type":"ContainerStarted","Data":"f6be86d85900e0becd276a3301dcd29580ffe31c943c212bc66ad97655ea38f9"} Apr 16 14:53:39.710430 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:39.710098 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-network-diagnostics/network-check-target-qh75b" Apr 16 14:53:39.723946 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:53:39.723881 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-qh75b" podStartSLOduration=65.596569777 podStartE2EDuration="1m8.723866635s" podCreationTimestamp="2026-04-16 14:52:31 +0000 UTC" firstStartedPulling="2026-04-16 14:53:36.409122039 +0000 UTC m=+65.591659796" lastFinishedPulling="2026-04-16 14:53:39.536418898 +0000 UTC m=+68.718956654" observedRunningTime="2026-04-16 14:53:39.723714661 +0000 UTC m=+68.906252434" watchObservedRunningTime="2026-04-16 14:53:39.723866635 +0000 UTC m=+68.906404408" Apr 16 14:54:07.617321 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:54:07.617275 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/26a2efd2-c6ce-46da-b1fe-6a739a8edd83-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-7st9s\" (UID: \"26a2efd2-c6ce-46da-b1fe-6a739a8edd83\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-7st9s" Apr 16 14:54:07.617750 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:54:07.617352 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/75f5567a-e94c-421f-8e1b-be0c5c44e10e-registry-tls\") pod \"image-registry-5d9f76899d-gd279\" (UID: \"75f5567a-e94c-421f-8e1b-be0c5c44e10e\") " pod="openshift-image-registry/image-registry-5d9f76899d-gd279" Apr 16 14:54:07.617750 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:54:07.617424 2579 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 14:54:07.617750 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:54:07.617435 2579 projected.go:264] Couldn't get secret 
openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 14:54:07.617750 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:54:07.617447 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5d9f76899d-gd279: secret "image-registry-tls" not found Apr 16 14:54:07.617750 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:54:07.617493 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/75f5567a-e94c-421f-8e1b-be0c5c44e10e-registry-tls podName:75f5567a-e94c-421f-8e1b-be0c5c44e10e nodeName:}" failed. No retries permitted until 2026-04-16 14:55:11.617480604 +0000 UTC m=+160.800018356 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/75f5567a-e94c-421f-8e1b-be0c5c44e10e-registry-tls") pod "image-registry-5d9f76899d-gd279" (UID: "75f5567a-e94c-421f-8e1b-be0c5c44e10e") : secret "image-registry-tls" not found Apr 16 14:54:07.617750 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:54:07.617505 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26a2efd2-c6ce-46da-b1fe-6a739a8edd83-networking-console-plugin-cert podName:26a2efd2-c6ce-46da-b1fe-6a739a8edd83 nodeName:}" failed. No retries permitted until 2026-04-16 14:55:11.617499307 +0000 UTC m=+160.800037059 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/26a2efd2-c6ce-46da-b1fe-6a739a8edd83-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-7st9s" (UID: "26a2efd2-c6ce-46da-b1fe-6a739a8edd83") : secret "networking-console-plugin-cert" not found Apr 16 14:54:07.721221 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:54:07.718624 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/545cbae5-e741-4260-bd9d-d9cf35cd5a5a-metrics-tls\") pod \"dns-default-hfndr\" (UID: \"545cbae5-e741-4260-bd9d-d9cf35cd5a5a\") " pod="openshift-dns/dns-default-hfndr" Apr 16 14:54:07.721221 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:54:07.718683 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/91fb67d9-4995-4dfe-bc80-067575a9d732-cert\") pod \"ingress-canary-kd7rp\" (UID: \"91fb67d9-4995-4dfe-bc80-067575a9d732\") " pod="openshift-ingress-canary/ingress-canary-kd7rp" Apr 16 14:54:07.721221 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:54:07.718865 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:54:07.721221 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:54:07.718967 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/91fb67d9-4995-4dfe-bc80-067575a9d732-cert podName:91fb67d9-4995-4dfe-bc80-067575a9d732 nodeName:}" failed. No retries permitted until 2026-04-16 14:55:11.718945272 +0000 UTC m=+160.901483047 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/91fb67d9-4995-4dfe-bc80-067575a9d732-cert") pod "ingress-canary-kd7rp" (UID: "91fb67d9-4995-4dfe-bc80-067575a9d732") : secret "canary-serving-cert" not found Apr 16 14:54:07.721221 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:54:07.719522 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 14:54:07.721221 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:54:07.719580 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/545cbae5-e741-4260-bd9d-d9cf35cd5a5a-metrics-tls podName:545cbae5-e741-4260-bd9d-d9cf35cd5a5a nodeName:}" failed. No retries permitted until 2026-04-16 14:55:11.719555624 +0000 UTC m=+160.902093375 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/545cbae5-e741-4260-bd9d-d9cf35cd5a5a-metrics-tls") pod "dns-default-hfndr" (UID: "545cbae5-e741-4260-bd9d-d9cf35cd5a5a") : secret "dns-default-metrics-tls" not found Apr 16 14:54:10.714678 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:54:10.714649 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-qh75b" Apr 16 14:54:40.168825 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:54:40.168788 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ba7ca911-2481-4fe7-9079-e770b1840406-metrics-certs\") pod \"network-metrics-daemon-vpmk7\" (UID: \"ba7ca911-2481-4fe7-9079-e770b1840406\") " pod="openshift-multus/network-metrics-daemon-vpmk7" Apr 16 14:54:40.169319 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:54:40.168937 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 14:54:40.169319 ip-10-0-139-47 kubenswrapper[2579]: E0416 
14:54:40.169003 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ba7ca911-2481-4fe7-9079-e770b1840406-metrics-certs podName:ba7ca911-2481-4fe7-9079-e770b1840406 nodeName:}" failed. No retries permitted until 2026-04-16 14:56:42.168983965 +0000 UTC m=+251.351521728 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ba7ca911-2481-4fe7-9079-e770b1840406-metrics-certs") pod "network-metrics-daemon-vpmk7" (UID: "ba7ca911-2481-4fe7-9079-e770b1840406") : secret "metrics-daemon-secret" not found Apr 16 14:54:55.996984 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:54:55.996954 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-mvlp4_c0f8acc5-b88a-4adf-b2de-441b178001bf/dns-node-resolver/0.log" Apr 16 14:54:56.997707 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:54:56.997681 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-dl2m4_da8fe018-bd6c-490e-9197-d5ea9c881a92/node-ca/0.log" Apr 16 14:55:06.707621 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:55:06.707582 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-5d9f76899d-gd279" podUID="75f5567a-e94c-421f-8e1b-be0c5c44e10e" Apr 16 14:55:06.723773 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:55:06.723735 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-7st9s" podUID="26a2efd2-c6ce-46da-b1fe-6a739a8edd83" Apr 16 14:55:06.760691 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:55:06.760654 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted 
volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-kd7rp" podUID="91fb67d9-4995-4dfe-bc80-067575a9d732" Apr 16 14:55:06.826053 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:55:06.826005 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-hfndr" podUID="545cbae5-e741-4260-bd9d-d9cf35cd5a5a" Apr 16 14:55:06.908678 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:06.908646 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-kd7rp" Apr 16 14:55:06.908821 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:06.908646 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5d9f76899d-gd279" Apr 16 14:55:07.656115 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:07.656081 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-ktl5m"] Apr 16 14:55:07.659031 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:07.659014 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-ktl5m" Apr 16 14:55:07.662955 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:07.662933 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 16 14:55:07.663999 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:07.663976 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-2b2x9\"" Apr 16 14:55:07.664178 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:07.664027 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 16 14:55:07.664178 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:07.664069 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 16 14:55:07.664178 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:07.664106 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 16 14:55:07.673186 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:07.673159 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-ktl5m"] Apr 16 14:55:07.783684 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:07.783645 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/0c829937-28c4-4030-9425-91eda92303e3-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-ktl5m\" (UID: \"0c829937-28c4-4030-9425-91eda92303e3\") " pod="openshift-insights/insights-runtime-extractor-ktl5m" Apr 16 14:55:07.784093 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:07.783694 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/0c829937-28c4-4030-9425-91eda92303e3-data-volume\") pod \"insights-runtime-extractor-ktl5m\" (UID: \"0c829937-28c4-4030-9425-91eda92303e3\") " pod="openshift-insights/insights-runtime-extractor-ktl5m" Apr 16 14:55:07.784093 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:07.783763 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/0c829937-28c4-4030-9425-91eda92303e3-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-ktl5m\" (UID: \"0c829937-28c4-4030-9425-91eda92303e3\") " pod="openshift-insights/insights-runtime-extractor-ktl5m" Apr 16 14:55:07.784093 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:07.783838 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxfm5\" (UniqueName: \"kubernetes.io/projected/0c829937-28c4-4030-9425-91eda92303e3-kube-api-access-jxfm5\") pod \"insights-runtime-extractor-ktl5m\" (UID: \"0c829937-28c4-4030-9425-91eda92303e3\") " pod="openshift-insights/insights-runtime-extractor-ktl5m" Apr 16 14:55:07.784093 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:07.783888 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/0c829937-28c4-4030-9425-91eda92303e3-crio-socket\") pod \"insights-runtime-extractor-ktl5m\" (UID: \"0c829937-28c4-4030-9425-91eda92303e3\") " pod="openshift-insights/insights-runtime-extractor-ktl5m" Apr 16 14:55:07.885148 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:07.885118 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/0c829937-28c4-4030-9425-91eda92303e3-crio-socket\") pod \"insights-runtime-extractor-ktl5m\" (UID: \"0c829937-28c4-4030-9425-91eda92303e3\") " 
pod="openshift-insights/insights-runtime-extractor-ktl5m" Apr 16 14:55:07.885275 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:07.885210 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/0c829937-28c4-4030-9425-91eda92303e3-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-ktl5m\" (UID: \"0c829937-28c4-4030-9425-91eda92303e3\") " pod="openshift-insights/insights-runtime-extractor-ktl5m" Apr 16 14:55:07.885275 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:07.885235 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/0c829937-28c4-4030-9425-91eda92303e3-crio-socket\") pod \"insights-runtime-extractor-ktl5m\" (UID: \"0c829937-28c4-4030-9425-91eda92303e3\") " pod="openshift-insights/insights-runtime-extractor-ktl5m" Apr 16 14:55:07.885275 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:07.885254 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/0c829937-28c4-4030-9425-91eda92303e3-data-volume\") pod \"insights-runtime-extractor-ktl5m\" (UID: \"0c829937-28c4-4030-9425-91eda92303e3\") " pod="openshift-insights/insights-runtime-extractor-ktl5m" Apr 16 14:55:07.885377 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:07.885285 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/0c829937-28c4-4030-9425-91eda92303e3-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-ktl5m\" (UID: \"0c829937-28c4-4030-9425-91eda92303e3\") " pod="openshift-insights/insights-runtime-extractor-ktl5m" Apr 16 14:55:07.885377 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:07.885304 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jxfm5\" (UniqueName: 
\"kubernetes.io/projected/0c829937-28c4-4030-9425-91eda92303e3-kube-api-access-jxfm5\") pod \"insights-runtime-extractor-ktl5m\" (UID: \"0c829937-28c4-4030-9425-91eda92303e3\") " pod="openshift-insights/insights-runtime-extractor-ktl5m" Apr 16 14:55:07.885377 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:55:07.885354 2579 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 16 14:55:07.885485 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:55:07.885427 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c829937-28c4-4030-9425-91eda92303e3-insights-runtime-extractor-tls podName:0c829937-28c4-4030-9425-91eda92303e3 nodeName:}" failed. No retries permitted until 2026-04-16 14:55:08.385411274 +0000 UTC m=+157.567949026 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/0c829937-28c4-4030-9425-91eda92303e3-insights-runtime-extractor-tls") pod "insights-runtime-extractor-ktl5m" (UID: "0c829937-28c4-4030-9425-91eda92303e3") : secret "insights-runtime-extractor-tls" not found Apr 16 14:55:07.885663 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:07.885644 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/0c829937-28c4-4030-9425-91eda92303e3-data-volume\") pod \"insights-runtime-extractor-ktl5m\" (UID: \"0c829937-28c4-4030-9425-91eda92303e3\") " pod="openshift-insights/insights-runtime-extractor-ktl5m" Apr 16 14:55:07.885858 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:07.885840 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/0c829937-28c4-4030-9425-91eda92303e3-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-ktl5m\" (UID: \"0c829937-28c4-4030-9425-91eda92303e3\") " 
pod="openshift-insights/insights-runtime-extractor-ktl5m" Apr 16 14:55:07.894501 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:07.894482 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxfm5\" (UniqueName: \"kubernetes.io/projected/0c829937-28c4-4030-9425-91eda92303e3-kube-api-access-jxfm5\") pod \"insights-runtime-extractor-ktl5m\" (UID: \"0c829937-28c4-4030-9425-91eda92303e3\") " pod="openshift-insights/insights-runtime-extractor-ktl5m" Apr 16 14:55:08.371420 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:55:08.371384 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-vpmk7" podUID="ba7ca911-2481-4fe7-9079-e770b1840406" Apr 16 14:55:08.388694 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:08.388671 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/0c829937-28c4-4030-9425-91eda92303e3-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-ktl5m\" (UID: \"0c829937-28c4-4030-9425-91eda92303e3\") " pod="openshift-insights/insights-runtime-extractor-ktl5m" Apr 16 14:55:08.388864 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:55:08.388841 2579 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 16 14:55:08.388973 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:55:08.388960 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c829937-28c4-4030-9425-91eda92303e3-insights-runtime-extractor-tls podName:0c829937-28c4-4030-9425-91eda92303e3 nodeName:}" failed. No retries permitted until 2026-04-16 14:55:09.388938245 +0000 UTC m=+158.571476014 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/0c829937-28c4-4030-9425-91eda92303e3-insights-runtime-extractor-tls") pod "insights-runtime-extractor-ktl5m" (UID: "0c829937-28c4-4030-9425-91eda92303e3") : secret "insights-runtime-extractor-tls" not found Apr 16 14:55:09.398318 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:09.398276 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/0c829937-28c4-4030-9425-91eda92303e3-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-ktl5m\" (UID: \"0c829937-28c4-4030-9425-91eda92303e3\") " pod="openshift-insights/insights-runtime-extractor-ktl5m" Apr 16 14:55:09.398709 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:55:09.398461 2579 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 16 14:55:09.398709 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:55:09.398551 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c829937-28c4-4030-9425-91eda92303e3-insights-runtime-extractor-tls podName:0c829937-28c4-4030-9425-91eda92303e3 nodeName:}" failed. No retries permitted until 2026-04-16 14:55:11.398530194 +0000 UTC m=+160.581067966 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/0c829937-28c4-4030-9425-91eda92303e3-insights-runtime-extractor-tls") pod "insights-runtime-extractor-ktl5m" (UID: "0c829937-28c4-4030-9425-91eda92303e3") : secret "insights-runtime-extractor-tls" not found Apr 16 14:55:10.919444 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:10.919410 2579 generic.go:358] "Generic (PLEG): container finished" podID="29711f9c-de92-459f-9949-54d43d314175" containerID="04c4342665c366459e04bb62fd9375f1e5af8a125b4789efedb97287ab8e27ef" exitCode=1 Apr 16 14:55:10.919867 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:10.919483 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6fd7f98bbb-69vqc" event={"ID":"29711f9c-de92-459f-9949-54d43d314175","Type":"ContainerDied","Data":"04c4342665c366459e04bb62fd9375f1e5af8a125b4789efedb97287ab8e27ef"} Apr 16 14:55:10.919867 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:10.919856 2579 scope.go:117] "RemoveContainer" containerID="04c4342665c366459e04bb62fd9375f1e5af8a125b4789efedb97287ab8e27ef" Apr 16 14:55:10.920767 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:10.920744 2579 generic.go:358] "Generic (PLEG): container finished" podID="cabd1204-eddf-42ed-9bad-9f118298b94b" containerID="0b9ea46515a38cf8108003f29db3450fe9b3a0b2b63481afb2e560dbc597fd60" exitCode=255 Apr 16 14:55:10.920842 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:10.920797 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-77fc8586d5-vtrgt" event={"ID":"cabd1204-eddf-42ed-9bad-9f118298b94b","Type":"ContainerDied","Data":"0b9ea46515a38cf8108003f29db3450fe9b3a0b2b63481afb2e560dbc597fd60"} Apr 16 14:55:10.921174 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:10.921121 2579 scope.go:117] "RemoveContainer" 
containerID="0b9ea46515a38cf8108003f29db3450fe9b3a0b2b63481afb2e560dbc597fd60" Apr 16 14:55:11.413492 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:11.413458 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/0c829937-28c4-4030-9425-91eda92303e3-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-ktl5m\" (UID: \"0c829937-28c4-4030-9425-91eda92303e3\") " pod="openshift-insights/insights-runtime-extractor-ktl5m" Apr 16 14:55:11.413673 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:55:11.413589 2579 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 16 14:55:11.413673 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:55:11.413647 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c829937-28c4-4030-9425-91eda92303e3-insights-runtime-extractor-tls podName:0c829937-28c4-4030-9425-91eda92303e3 nodeName:}" failed. No retries permitted until 2026-04-16 14:55:15.413632366 +0000 UTC m=+164.596170118 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/0c829937-28c4-4030-9425-91eda92303e3-insights-runtime-extractor-tls") pod "insights-runtime-extractor-ktl5m" (UID: "0c829937-28c4-4030-9425-91eda92303e3") : secret "insights-runtime-extractor-tls" not found Apr 16 14:55:11.634240 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:11.634196 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6fd7f98bbb-69vqc" Apr 16 14:55:11.715457 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:11.715370 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/75f5567a-e94c-421f-8e1b-be0c5c44e10e-registry-tls\") pod \"image-registry-5d9f76899d-gd279\" (UID: \"75f5567a-e94c-421f-8e1b-be0c5c44e10e\") " pod="openshift-image-registry/image-registry-5d9f76899d-gd279" Apr 16 14:55:11.715457 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:11.715430 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/26a2efd2-c6ce-46da-b1fe-6a739a8edd83-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-7st9s\" (UID: \"26a2efd2-c6ce-46da-b1fe-6a739a8edd83\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-7st9s" Apr 16 14:55:11.715731 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:55:11.715532 2579 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 14:55:11.715731 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:55:11.715587 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 14:55:11.715731 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:55:11.715609 2579 
projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5d9f76899d-gd279: secret "image-registry-tls" not found Apr 16 14:55:11.715731 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:55:11.715595 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26a2efd2-c6ce-46da-b1fe-6a739a8edd83-networking-console-plugin-cert podName:26a2efd2-c6ce-46da-b1fe-6a739a8edd83 nodeName:}" failed. No retries permitted until 2026-04-16 14:57:13.715578083 +0000 UTC m=+282.898115836 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/26a2efd2-c6ce-46da-b1fe-6a739a8edd83-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-7st9s" (UID: "26a2efd2-c6ce-46da-b1fe-6a739a8edd83") : secret "networking-console-plugin-cert" not found Apr 16 14:55:11.715731 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:55:11.715673 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/75f5567a-e94c-421f-8e1b-be0c5c44e10e-registry-tls podName:75f5567a-e94c-421f-8e1b-be0c5c44e10e nodeName:}" failed. No retries permitted until 2026-04-16 14:57:13.715657809 +0000 UTC m=+282.898195564 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/75f5567a-e94c-421f-8e1b-be0c5c44e10e-registry-tls") pod "image-registry-5d9f76899d-gd279" (UID: "75f5567a-e94c-421f-8e1b-be0c5c44e10e") : secret "image-registry-tls" not found Apr 16 14:55:11.816633 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:11.816595 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/545cbae5-e741-4260-bd9d-d9cf35cd5a5a-metrics-tls\") pod \"dns-default-hfndr\" (UID: \"545cbae5-e741-4260-bd9d-d9cf35cd5a5a\") " pod="openshift-dns/dns-default-hfndr" Apr 16 14:55:11.816633 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:11.816634 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/91fb67d9-4995-4dfe-bc80-067575a9d732-cert\") pod \"ingress-canary-kd7rp\" (UID: \"91fb67d9-4995-4dfe-bc80-067575a9d732\") " pod="openshift-ingress-canary/ingress-canary-kd7rp" Apr 16 14:55:11.816997 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:55:11.816767 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:55:11.816997 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:55:11.816826 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/91fb67d9-4995-4dfe-bc80-067575a9d732-cert podName:91fb67d9-4995-4dfe-bc80-067575a9d732 nodeName:}" failed. No retries permitted until 2026-04-16 14:57:13.816804976 +0000 UTC m=+282.999342729 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/91fb67d9-4995-4dfe-bc80-067575a9d732-cert") pod "ingress-canary-kd7rp" (UID: "91fb67d9-4995-4dfe-bc80-067575a9d732") : secret "canary-serving-cert" not found Apr 16 14:55:11.818976 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:11.818952 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/545cbae5-e741-4260-bd9d-d9cf35cd5a5a-metrics-tls\") pod \"dns-default-hfndr\" (UID: \"545cbae5-e741-4260-bd9d-d9cf35cd5a5a\") " pod="openshift-dns/dns-default-hfndr" Apr 16 14:55:11.924660 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:11.924627 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-77fc8586d5-vtrgt" event={"ID":"cabd1204-eddf-42ed-9bad-9f118298b94b","Type":"ContainerStarted","Data":"f1be09f49ff9d8646a220dd81f0c53725aee1a04b335b66fc258b7609e64b885"} Apr 16 14:55:11.926147 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:11.926122 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6fd7f98bbb-69vqc" event={"ID":"29711f9c-de92-459f-9949-54d43d314175","Type":"ContainerStarted","Data":"279cfd78f571cea3ad8f152e026bd99d946d8c23a1f1c746f13789da515b386b"} Apr 16 14:55:11.926355 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:11.926335 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6fd7f98bbb-69vqc" Apr 16 14:55:11.926947 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:11.926932 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6fd7f98bbb-69vqc" Apr 16 14:55:15.445070 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:15.445028 2579 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/0c829937-28c4-4030-9425-91eda92303e3-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-ktl5m\" (UID: \"0c829937-28c4-4030-9425-91eda92303e3\") " pod="openshift-insights/insights-runtime-extractor-ktl5m" Apr 16 14:55:15.447402 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:15.447369 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/0c829937-28c4-4030-9425-91eda92303e3-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-ktl5m\" (UID: \"0c829937-28c4-4030-9425-91eda92303e3\") " pod="openshift-insights/insights-runtime-extractor-ktl5m" Apr 16 14:55:15.468412 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:15.468375 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-ktl5m" Apr 16 14:55:15.584392 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:15.584355 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-ktl5m"] Apr 16 14:55:15.587803 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:55:15.587776 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c829937_28c4_4030_9425_91eda92303e3.slice/crio-851e19c0246c09fe8ef933dedc4ed939d06aeb427824658d87e0137fa97cee34 WatchSource:0}: Error finding container 851e19c0246c09fe8ef933dedc4ed939d06aeb427824658d87e0137fa97cee34: Status 404 returned error can't find the container with id 851e19c0246c09fe8ef933dedc4ed939d06aeb427824658d87e0137fa97cee34 Apr 16 14:55:15.936175 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:15.936140 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-ktl5m" 
event={"ID":"0c829937-28c4-4030-9425-91eda92303e3","Type":"ContainerStarted","Data":"f9054526c3e080f194951d37c21ce15d3ae8d9e77734f6836c4d102217a9070a"} Apr 16 14:55:15.936175 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:15.936176 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-ktl5m" event={"ID":"0c829937-28c4-4030-9425-91eda92303e3","Type":"ContainerStarted","Data":"851e19c0246c09fe8ef933dedc4ed939d06aeb427824658d87e0137fa97cee34"} Apr 16 14:55:16.940219 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:16.940180 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-ktl5m" event={"ID":"0c829937-28c4-4030-9425-91eda92303e3","Type":"ContainerStarted","Data":"c4e6a101eae572688873c99d253ea44a176bcf3bd9aaba0ebdee14f6cecfa374"} Apr 16 14:55:17.946432 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:17.946334 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-ktl5m" event={"ID":"0c829937-28c4-4030-9425-91eda92303e3","Type":"ContainerStarted","Data":"6044272a067c6e9e277e98912008747251338ec82c2f8b9c22bd6bde394d1945"} Apr 16 14:55:17.965594 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:17.965549 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-ktl5m" podStartSLOduration=8.942951546 podStartE2EDuration="10.965534366s" podCreationTimestamp="2026-04-16 14:55:07 +0000 UTC" firstStartedPulling="2026-04-16 14:55:15.643407764 +0000 UTC m=+164.825945516" lastFinishedPulling="2026-04-16 14:55:17.665990584 +0000 UTC m=+166.848528336" observedRunningTime="2026-04-16 14:55:17.964244497 +0000 UTC m=+167.146782270" watchObservedRunningTime="2026-04-16 14:55:17.965534366 +0000 UTC m=+167.148072139" Apr 16 14:55:19.350406 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:19.350315 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-hfndr" Apr 16 14:55:19.353316 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:19.353293 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-nrznh\"" Apr 16 14:55:19.361230 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:19.361206 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-hfndr" Apr 16 14:55:19.477224 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:19.477194 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-hfndr"] Apr 16 14:55:19.480587 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:55:19.480562 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod545cbae5_e741_4260_bd9d_d9cf35cd5a5a.slice/crio-78021e655c088b82c642a9791934f818075e364ecfd283d98acbd96b60c334d6 WatchSource:0}: Error finding container 78021e655c088b82c642a9791934f818075e364ecfd283d98acbd96b60c334d6: Status 404 returned error can't find the container with id 78021e655c088b82c642a9791934f818075e364ecfd283d98acbd96b60c334d6 Apr 16 14:55:19.953451 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:19.953411 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-hfndr" event={"ID":"545cbae5-e741-4260-bd9d-d9cf35cd5a5a","Type":"ContainerStarted","Data":"78021e655c088b82c642a9791934f818075e364ecfd283d98acbd96b60c334d6"} Apr 16 14:55:20.958114 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:20.958026 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-hfndr" event={"ID":"545cbae5-e741-4260-bd9d-d9cf35cd5a5a","Type":"ContainerStarted","Data":"03d779383bfc8a86cc5a0af0d4537069494329856349c9d349c201ceebeaf471"} Apr 16 14:55:20.958114 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:20.958064 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-dns/dns-default-hfndr" event={"ID":"545cbae5-e741-4260-bd9d-d9cf35cd5a5a","Type":"ContainerStarted","Data":"fcd594d37d34ae6b967b7825971d1ae8c6d30fb376826cee558ca870ab5a9d5d"} Apr 16 14:55:20.958520 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:20.958175 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-hfndr" Apr 16 14:55:20.974468 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:20.974413 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-hfndr" podStartSLOduration=136.830165322 podStartE2EDuration="2m17.974395512s" podCreationTimestamp="2026-04-16 14:53:03 +0000 UTC" firstStartedPulling="2026-04-16 14:55:19.48276814 +0000 UTC m=+168.665305893" lastFinishedPulling="2026-04-16 14:55:20.626998332 +0000 UTC m=+169.809536083" observedRunningTime="2026-04-16 14:55:20.973761086 +0000 UTC m=+170.156298855" watchObservedRunningTime="2026-04-16 14:55:20.974395512 +0000 UTC m=+170.156933286" Apr 16 14:55:21.352449 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:21.352407 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vpmk7" Apr 16 14:55:21.352661 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:21.352456 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-7st9s" Apr 16 14:55:30.963301 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:30.963272 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-hfndr" Apr 16 14:55:39.952499 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:39.952462 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-jpjvx"] Apr 16 14:55:39.958894 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:39.958870 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-jpjvx" Apr 16 14:55:39.961387 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:39.961359 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 16 14:55:39.961626 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:39.961599 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-48tx8\"" Apr 16 14:55:39.961750 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:39.961631 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 16 14:55:39.961750 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:39.961724 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 16 14:55:39.962491 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:39.962474 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 16 14:55:39.962553 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:39.962492 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 16 14:55:39.962553 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:39.962506 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 16 14:55:40.140645 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:40.140611 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1a854354-6033-440e-91bf-c5287d20c508-metrics-client-ca\") pod \"node-exporter-jpjvx\" (UID: \"1a854354-6033-440e-91bf-c5287d20c508\") " 
pod="openshift-monitoring/node-exporter-jpjvx" Apr 16 14:55:40.140645 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:40.140648 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/1a854354-6033-440e-91bf-c5287d20c508-root\") pod \"node-exporter-jpjvx\" (UID: \"1a854354-6033-440e-91bf-c5287d20c508\") " pod="openshift-monitoring/node-exporter-jpjvx" Apr 16 14:55:40.140838 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:40.140675 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/1a854354-6033-440e-91bf-c5287d20c508-node-exporter-tls\") pod \"node-exporter-jpjvx\" (UID: \"1a854354-6033-440e-91bf-c5287d20c508\") " pod="openshift-monitoring/node-exporter-jpjvx" Apr 16 14:55:40.140838 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:40.140766 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/1a854354-6033-440e-91bf-c5287d20c508-node-exporter-wtmp\") pod \"node-exporter-jpjvx\" (UID: \"1a854354-6033-440e-91bf-c5287d20c508\") " pod="openshift-monitoring/node-exporter-jpjvx" Apr 16 14:55:40.140838 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:40.140798 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/1a854354-6033-440e-91bf-c5287d20c508-node-exporter-accelerators-collector-config\") pod \"node-exporter-jpjvx\" (UID: \"1a854354-6033-440e-91bf-c5287d20c508\") " pod="openshift-monitoring/node-exporter-jpjvx" Apr 16 14:55:40.140838 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:40.140823 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/1a854354-6033-440e-91bf-c5287d20c508-sys\") pod \"node-exporter-jpjvx\" (UID: \"1a854354-6033-440e-91bf-c5287d20c508\") " pod="openshift-monitoring/node-exporter-jpjvx" Apr 16 14:55:40.141002 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:40.140925 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1a854354-6033-440e-91bf-c5287d20c508-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-jpjvx\" (UID: \"1a854354-6033-440e-91bf-c5287d20c508\") " pod="openshift-monitoring/node-exporter-jpjvx" Apr 16 14:55:40.141002 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:40.140956 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/1a854354-6033-440e-91bf-c5287d20c508-node-exporter-textfile\") pod \"node-exporter-jpjvx\" (UID: \"1a854354-6033-440e-91bf-c5287d20c508\") " pod="openshift-monitoring/node-exporter-jpjvx" Apr 16 14:55:40.141002 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:40.140971 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjjd8\" (UniqueName: \"kubernetes.io/projected/1a854354-6033-440e-91bf-c5287d20c508-kube-api-access-bjjd8\") pod \"node-exporter-jpjvx\" (UID: \"1a854354-6033-440e-91bf-c5287d20c508\") " pod="openshift-monitoring/node-exporter-jpjvx" Apr 16 14:55:40.241837 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:40.241750 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/1a854354-6033-440e-91bf-c5287d20c508-node-exporter-wtmp\") pod \"node-exporter-jpjvx\" (UID: \"1a854354-6033-440e-91bf-c5287d20c508\") " pod="openshift-monitoring/node-exporter-jpjvx" Apr 16 14:55:40.241837 ip-10-0-139-47 kubenswrapper[2579]: I0416 
14:55:40.241799 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/1a854354-6033-440e-91bf-c5287d20c508-node-exporter-accelerators-collector-config\") pod \"node-exporter-jpjvx\" (UID: \"1a854354-6033-440e-91bf-c5287d20c508\") " pod="openshift-monitoring/node-exporter-jpjvx" Apr 16 14:55:40.241837 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:40.241822 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1a854354-6033-440e-91bf-c5287d20c508-sys\") pod \"node-exporter-jpjvx\" (UID: \"1a854354-6033-440e-91bf-c5287d20c508\") " pod="openshift-monitoring/node-exporter-jpjvx" Apr 16 14:55:40.242091 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:40.241860 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1a854354-6033-440e-91bf-c5287d20c508-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-jpjvx\" (UID: \"1a854354-6033-440e-91bf-c5287d20c508\") " pod="openshift-monitoring/node-exporter-jpjvx" Apr 16 14:55:40.242091 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:40.241880 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/1a854354-6033-440e-91bf-c5287d20c508-node-exporter-textfile\") pod \"node-exporter-jpjvx\" (UID: \"1a854354-6033-440e-91bf-c5287d20c508\") " pod="openshift-monitoring/node-exporter-jpjvx" Apr 16 14:55:40.242091 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:40.241896 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bjjd8\" (UniqueName: \"kubernetes.io/projected/1a854354-6033-440e-91bf-c5287d20c508-kube-api-access-bjjd8\") pod \"node-exporter-jpjvx\" (UID: \"1a854354-6033-440e-91bf-c5287d20c508\") " 
pod="openshift-monitoring/node-exporter-jpjvx" Apr 16 14:55:40.242091 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:40.241958 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/1a854354-6033-440e-91bf-c5287d20c508-node-exporter-wtmp\") pod \"node-exporter-jpjvx\" (UID: \"1a854354-6033-440e-91bf-c5287d20c508\") " pod="openshift-monitoring/node-exporter-jpjvx" Apr 16 14:55:40.242091 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:40.241983 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1a854354-6033-440e-91bf-c5287d20c508-metrics-client-ca\") pod \"node-exporter-jpjvx\" (UID: \"1a854354-6033-440e-91bf-c5287d20c508\") " pod="openshift-monitoring/node-exporter-jpjvx" Apr 16 14:55:40.242091 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:40.241958 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1a854354-6033-440e-91bf-c5287d20c508-sys\") pod \"node-exporter-jpjvx\" (UID: \"1a854354-6033-440e-91bf-c5287d20c508\") " pod="openshift-monitoring/node-exporter-jpjvx" Apr 16 14:55:40.242091 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:40.242030 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/1a854354-6033-440e-91bf-c5287d20c508-root\") pod \"node-exporter-jpjvx\" (UID: \"1a854354-6033-440e-91bf-c5287d20c508\") " pod="openshift-monitoring/node-exporter-jpjvx" Apr 16 14:55:40.242091 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:40.242063 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/1a854354-6033-440e-91bf-c5287d20c508-root\") pod \"node-exporter-jpjvx\" (UID: \"1a854354-6033-440e-91bf-c5287d20c508\") " pod="openshift-monitoring/node-exporter-jpjvx" Apr 16 14:55:40.242426 
ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:40.242094 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/1a854354-6033-440e-91bf-c5287d20c508-node-exporter-tls\") pod \"node-exporter-jpjvx\" (UID: \"1a854354-6033-440e-91bf-c5287d20c508\") " pod="openshift-monitoring/node-exporter-jpjvx" Apr 16 14:55:40.242426 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:55:40.242213 2579 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 16 14:55:40.242426 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:55:40.242285 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1a854354-6033-440e-91bf-c5287d20c508-node-exporter-tls podName:1a854354-6033-440e-91bf-c5287d20c508 nodeName:}" failed. No retries permitted until 2026-04-16 14:55:40.742265588 +0000 UTC m=+189.924803356 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/1a854354-6033-440e-91bf-c5287d20c508-node-exporter-tls") pod "node-exporter-jpjvx" (UID: "1a854354-6033-440e-91bf-c5287d20c508") : secret "node-exporter-tls" not found Apr 16 14:55:40.242426 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:40.242330 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/1a854354-6033-440e-91bf-c5287d20c508-node-exporter-textfile\") pod \"node-exporter-jpjvx\" (UID: \"1a854354-6033-440e-91bf-c5287d20c508\") " pod="openshift-monitoring/node-exporter-jpjvx" Apr 16 14:55:40.242622 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:40.242459 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/1a854354-6033-440e-91bf-c5287d20c508-node-exporter-accelerators-collector-config\") pod 
\"node-exporter-jpjvx\" (UID: \"1a854354-6033-440e-91bf-c5287d20c508\") " pod="openshift-monitoring/node-exporter-jpjvx" Apr 16 14:55:40.242622 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:40.242480 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1a854354-6033-440e-91bf-c5287d20c508-metrics-client-ca\") pod \"node-exporter-jpjvx\" (UID: \"1a854354-6033-440e-91bf-c5287d20c508\") " pod="openshift-monitoring/node-exporter-jpjvx" Apr 16 14:55:40.244353 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:40.244334 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1a854354-6033-440e-91bf-c5287d20c508-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-jpjvx\" (UID: \"1a854354-6033-440e-91bf-c5287d20c508\") " pod="openshift-monitoring/node-exporter-jpjvx" Apr 16 14:55:40.250556 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:40.250536 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjjd8\" (UniqueName: \"kubernetes.io/projected/1a854354-6033-440e-91bf-c5287d20c508-kube-api-access-bjjd8\") pod \"node-exporter-jpjvx\" (UID: \"1a854354-6033-440e-91bf-c5287d20c508\") " pod="openshift-monitoring/node-exporter-jpjvx" Apr 16 14:55:40.746408 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:40.746364 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/1a854354-6033-440e-91bf-c5287d20c508-node-exporter-tls\") pod \"node-exporter-jpjvx\" (UID: \"1a854354-6033-440e-91bf-c5287d20c508\") " pod="openshift-monitoring/node-exporter-jpjvx" Apr 16 14:55:40.748685 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:40.748667 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: 
\"kubernetes.io/secret/1a854354-6033-440e-91bf-c5287d20c508-node-exporter-tls\") pod \"node-exporter-jpjvx\" (UID: \"1a854354-6033-440e-91bf-c5287d20c508\") " pod="openshift-monitoring/node-exporter-jpjvx" Apr 16 14:55:40.867528 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:40.867493 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-jpjvx" Apr 16 14:55:40.875313 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:55:40.875282 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1a854354_6033_440e_91bf_c5287d20c508.slice/crio-dd56b6f238541512277fc2311382afb3ced6f46f550988fd649c0792fd009ed8 WatchSource:0}: Error finding container dd56b6f238541512277fc2311382afb3ced6f46f550988fd649c0792fd009ed8: Status 404 returned error can't find the container with id dd56b6f238541512277fc2311382afb3ced6f46f550988fd649c0792fd009ed8 Apr 16 14:55:41.011071 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:41.011024 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-jpjvx" event={"ID":"1a854354-6033-440e-91bf-c5287d20c508","Type":"ContainerStarted","Data":"dd56b6f238541512277fc2311382afb3ced6f46f550988fd649c0792fd009ed8"} Apr 16 14:55:42.015537 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:42.015500 2579 generic.go:358] "Generic (PLEG): container finished" podID="1a854354-6033-440e-91bf-c5287d20c508" containerID="2e74cee38a8f562ef06641c951063430bbbb0896da5c9b42c6cb118c7929b41b" exitCode=0 Apr 16 14:55:42.015956 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:42.015574 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-jpjvx" event={"ID":"1a854354-6033-440e-91bf-c5287d20c508","Type":"ContainerDied","Data":"2e74cee38a8f562ef06641c951063430bbbb0896da5c9b42c6cb118c7929b41b"} Apr 16 14:55:43.020442 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:43.020402 2579 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-jpjvx" event={"ID":"1a854354-6033-440e-91bf-c5287d20c508","Type":"ContainerStarted","Data":"7371b2ae8676eaafe0a55361b38098be565aaf90f959ba62bf4b91f9d172b251"} Apr 16 14:55:43.020442 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:43.020440 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-jpjvx" event={"ID":"1a854354-6033-440e-91bf-c5287d20c508","Type":"ContainerStarted","Data":"a84b3f058bc1d1713a70f2c5ed3224e5949e9ddde6348ae5ec12ab615c7336d4"} Apr 16 14:55:43.040634 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:43.040592 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-jpjvx" podStartSLOduration=3.321876547 podStartE2EDuration="4.040579198s" podCreationTimestamp="2026-04-16 14:55:39 +0000 UTC" firstStartedPulling="2026-04-16 14:55:40.877226429 +0000 UTC m=+190.059764182" lastFinishedPulling="2026-04-16 14:55:41.595929074 +0000 UTC m=+190.778466833" observedRunningTime="2026-04-16 14:55:43.03922776 +0000 UTC m=+192.221765535" watchObservedRunningTime="2026-04-16 14:55:43.040579198 +0000 UTC m=+192.223116971" Apr 16 14:55:44.085654 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:44.085602 2579 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b7b574cf-sbk4t" podUID="59a9063e-512b-4e8e-9292-b64616eb9a33" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 16 14:55:52.489165 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:52.489133 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5d9f76899d-gd279"] Apr 16 14:55:52.489556 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:55:52.489353 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed 
to process volumes=[]: context canceled" pod="openshift-image-registry/image-registry-5d9f76899d-gd279" podUID="75f5567a-e94c-421f-8e1b-be0c5c44e10e" Apr 16 14:55:53.045235 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:53.045204 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5d9f76899d-gd279" Apr 16 14:55:53.049285 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:53.049264 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5d9f76899d-gd279" Apr 16 14:55:53.145297 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:53.145242 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/75f5567a-e94c-421f-8e1b-be0c5c44e10e-image-registry-private-configuration\") pod \"75f5567a-e94c-421f-8e1b-be0c5c44e10e\" (UID: \"75f5567a-e94c-421f-8e1b-be0c5c44e10e\") " Apr 16 14:55:53.145297 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:53.145314 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/75f5567a-e94c-421f-8e1b-be0c5c44e10e-registry-certificates\") pod \"75f5567a-e94c-421f-8e1b-be0c5c44e10e\" (UID: \"75f5567a-e94c-421f-8e1b-be0c5c44e10e\") " Apr 16 14:55:53.145544 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:53.145344 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/75f5567a-e94c-421f-8e1b-be0c5c44e10e-trusted-ca\") pod \"75f5567a-e94c-421f-8e1b-be0c5c44e10e\" (UID: \"75f5567a-e94c-421f-8e1b-be0c5c44e10e\") " Apr 16 14:55:53.145544 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:53.145367 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p4fv7\" (UniqueName: 
\"kubernetes.io/projected/75f5567a-e94c-421f-8e1b-be0c5c44e10e-kube-api-access-p4fv7\") pod \"75f5567a-e94c-421f-8e1b-be0c5c44e10e\" (UID: \"75f5567a-e94c-421f-8e1b-be0c5c44e10e\") " Apr 16 14:55:53.145544 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:53.145389 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/75f5567a-e94c-421f-8e1b-be0c5c44e10e-bound-sa-token\") pod \"75f5567a-e94c-421f-8e1b-be0c5c44e10e\" (UID: \"75f5567a-e94c-421f-8e1b-be0c5c44e10e\") " Apr 16 14:55:53.145544 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:53.145406 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/75f5567a-e94c-421f-8e1b-be0c5c44e10e-installation-pull-secrets\") pod \"75f5567a-e94c-421f-8e1b-be0c5c44e10e\" (UID: \"75f5567a-e94c-421f-8e1b-be0c5c44e10e\") " Apr 16 14:55:53.145544 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:53.145424 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/75f5567a-e94c-421f-8e1b-be0c5c44e10e-ca-trust-extracted\") pod \"75f5567a-e94c-421f-8e1b-be0c5c44e10e\" (UID: \"75f5567a-e94c-421f-8e1b-be0c5c44e10e\") " Apr 16 14:55:53.145874 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:53.145840 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75f5567a-e94c-421f-8e1b-be0c5c44e10e-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "75f5567a-e94c-421f-8e1b-be0c5c44e10e" (UID: "75f5567a-e94c-421f-8e1b-be0c5c44e10e"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:55:53.146000 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:53.145840 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75f5567a-e94c-421f-8e1b-be0c5c44e10e-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "75f5567a-e94c-421f-8e1b-be0c5c44e10e" (UID: "75f5567a-e94c-421f-8e1b-be0c5c44e10e"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:55:53.146063 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:53.145993 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75f5567a-e94c-421f-8e1b-be0c5c44e10e-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "75f5567a-e94c-421f-8e1b-be0c5c44e10e" (UID: "75f5567a-e94c-421f-8e1b-be0c5c44e10e"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:55:53.147681 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:53.147649 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75f5567a-e94c-421f-8e1b-be0c5c44e10e-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "75f5567a-e94c-421f-8e1b-be0c5c44e10e" (UID: "75f5567a-e94c-421f-8e1b-be0c5c44e10e"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:55:53.147787 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:53.147696 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75f5567a-e94c-421f-8e1b-be0c5c44e10e-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "75f5567a-e94c-421f-8e1b-be0c5c44e10e" (UID: "75f5567a-e94c-421f-8e1b-be0c5c44e10e"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:55:53.147869 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:53.147848 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75f5567a-e94c-421f-8e1b-be0c5c44e10e-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "75f5567a-e94c-421f-8e1b-be0c5c44e10e" (UID: "75f5567a-e94c-421f-8e1b-be0c5c44e10e"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:55:53.147956 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:53.147932 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75f5567a-e94c-421f-8e1b-be0c5c44e10e-kube-api-access-p4fv7" (OuterVolumeSpecName: "kube-api-access-p4fv7") pod "75f5567a-e94c-421f-8e1b-be0c5c44e10e" (UID: "75f5567a-e94c-421f-8e1b-be0c5c44e10e"). InnerVolumeSpecName "kube-api-access-p4fv7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:55:53.247005 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:53.246967 2579 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/75f5567a-e94c-421f-8e1b-be0c5c44e10e-image-registry-private-configuration\") on node \"ip-10-0-139-47.ec2.internal\" DevicePath \"\"" Apr 16 14:55:53.247005 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:53.246996 2579 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/75f5567a-e94c-421f-8e1b-be0c5c44e10e-registry-certificates\") on node \"ip-10-0-139-47.ec2.internal\" DevicePath \"\"" Apr 16 14:55:53.247005 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:53.247010 2579 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/75f5567a-e94c-421f-8e1b-be0c5c44e10e-trusted-ca\") on node 
\"ip-10-0-139-47.ec2.internal\" DevicePath \"\"" Apr 16 14:55:53.247228 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:53.247020 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-p4fv7\" (UniqueName: \"kubernetes.io/projected/75f5567a-e94c-421f-8e1b-be0c5c44e10e-kube-api-access-p4fv7\") on node \"ip-10-0-139-47.ec2.internal\" DevicePath \"\"" Apr 16 14:55:53.247228 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:53.247029 2579 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/75f5567a-e94c-421f-8e1b-be0c5c44e10e-bound-sa-token\") on node \"ip-10-0-139-47.ec2.internal\" DevicePath \"\"" Apr 16 14:55:53.247228 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:53.247038 2579 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/75f5567a-e94c-421f-8e1b-be0c5c44e10e-installation-pull-secrets\") on node \"ip-10-0-139-47.ec2.internal\" DevicePath \"\"" Apr 16 14:55:53.247228 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:53.247047 2579 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/75f5567a-e94c-421f-8e1b-be0c5c44e10e-ca-trust-extracted\") on node \"ip-10-0-139-47.ec2.internal\" DevicePath \"\"" Apr 16 14:55:54.047799 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:54.047763 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-5d9f76899d-gd279" Apr 16 14:55:54.081346 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:54.081313 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5d9f76899d-gd279"] Apr 16 14:55:54.084968 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:54.084937 2579 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b7b574cf-sbk4t" podUID="59a9063e-512b-4e8e-9292-b64616eb9a33" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 16 14:55:54.085722 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:54.085704 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-5d9f76899d-gd279"] Apr 16 14:55:54.155091 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:54.155058 2579 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/75f5567a-e94c-421f-8e1b-be0c5c44e10e-registry-tls\") on node \"ip-10-0-139-47.ec2.internal\" DevicePath \"\"" Apr 16 14:55:55.353768 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:55:55.353737 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75f5567a-e94c-421f-8e1b-be0c5c44e10e" path="/var/lib/kubelet/pods/75f5567a-e94c-421f-8e1b-be0c5c44e10e/volumes" Apr 16 14:56:04.085455 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:56:04.085412 2579 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b7b574cf-sbk4t" podUID="59a9063e-512b-4e8e-9292-b64616eb9a33" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 16 14:56:04.085929 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:56:04.085493 2579 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b7b574cf-sbk4t" Apr 16 14:56:04.086028 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:56:04.085996 2579 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"19f427fe79e3ddd04f2c8d6e1502d5058e17ded6839a52f772bcf44996b23ed8"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b7b574cf-sbk4t" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 16 14:56:04.086066 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:56:04.086048 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b7b574cf-sbk4t" podUID="59a9063e-512b-4e8e-9292-b64616eb9a33" containerName="service-proxy" containerID="cri-o://19f427fe79e3ddd04f2c8d6e1502d5058e17ded6839a52f772bcf44996b23ed8" gracePeriod=30 Apr 16 14:56:05.075503 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:56:05.075462 2579 generic.go:358] "Generic (PLEG): container finished" podID="59a9063e-512b-4e8e-9292-b64616eb9a33" containerID="19f427fe79e3ddd04f2c8d6e1502d5058e17ded6839a52f772bcf44996b23ed8" exitCode=2 Apr 16 14:56:05.075650 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:56:05.075529 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b7b574cf-sbk4t" event={"ID":"59a9063e-512b-4e8e-9292-b64616eb9a33","Type":"ContainerDied","Data":"19f427fe79e3ddd04f2c8d6e1502d5058e17ded6839a52f772bcf44996b23ed8"} Apr 16 14:56:05.075650 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:56:05.075567 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b7b574cf-sbk4t" event={"ID":"59a9063e-512b-4e8e-9292-b64616eb9a33","Type":"ContainerStarted","Data":"7907a15c464d0991879375297c3ca0f05a687454bbb4ac78439dddc6fd5d2b08"} Apr 16 14:56:42.233566 
ip-10-0-139-47 kubenswrapper[2579]: I0416 14:56:42.233524 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ba7ca911-2481-4fe7-9079-e770b1840406-metrics-certs\") pod \"network-metrics-daemon-vpmk7\" (UID: \"ba7ca911-2481-4fe7-9079-e770b1840406\") " pod="openshift-multus/network-metrics-daemon-vpmk7" Apr 16 14:56:42.235969 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:56:42.235945 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ba7ca911-2481-4fe7-9079-e770b1840406-metrics-certs\") pod \"network-metrics-daemon-vpmk7\" (UID: \"ba7ca911-2481-4fe7-9079-e770b1840406\") " pod="openshift-multus/network-metrics-daemon-vpmk7" Apr 16 14:56:42.355570 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:56:42.355533 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-lnt9d\"" Apr 16 14:56:42.363367 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:56:42.363343 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vpmk7" Apr 16 14:56:42.474526 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:56:42.474495 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-vpmk7"] Apr 16 14:56:42.479400 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:56:42.479359 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba7ca911_2481_4fe7_9079_e770b1840406.slice/crio-0d4bb720cb425cc2f78cb6e5adbed808b572f903f45be122f7b8f531e64afa01 WatchSource:0}: Error finding container 0d4bb720cb425cc2f78cb6e5adbed808b572f903f45be122f7b8f531e64afa01: Status 404 returned error can't find the container with id 0d4bb720cb425cc2f78cb6e5adbed808b572f903f45be122f7b8f531e64afa01 Apr 16 14:56:43.179316 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:56:43.179277 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-vpmk7" event={"ID":"ba7ca911-2481-4fe7-9079-e770b1840406","Type":"ContainerStarted","Data":"0d4bb720cb425cc2f78cb6e5adbed808b572f903f45be122f7b8f531e64afa01"} Apr 16 14:56:44.185493 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:56:44.185454 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-vpmk7" event={"ID":"ba7ca911-2481-4fe7-9079-e770b1840406","Type":"ContainerStarted","Data":"6c037c1a81c1d0b6f9237fce2a73af0c6e8e4bd1e144836d97dc910bee175e12"} Apr 16 14:56:44.185493 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:56:44.185495 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-vpmk7" event={"ID":"ba7ca911-2481-4fe7-9079-e770b1840406","Type":"ContainerStarted","Data":"8d276a9a43ca9165654719ed111f5d7c77eab65dbd7cb54da86c6c0ae60c58d7"} Apr 16 14:57:09.909505 ip-10-0-139-47 kubenswrapper[2579]: E0416 14:57:09.909458 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted 
volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-kd7rp" podUID="91fb67d9-4995-4dfe-bc80-067575a9d732" Apr 16 14:57:10.251603 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:57:10.251574 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-kd7rp" Apr 16 14:57:13.788592 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:57:13.788551 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/26a2efd2-c6ce-46da-b1fe-6a739a8edd83-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-7st9s\" (UID: \"26a2efd2-c6ce-46da-b1fe-6a739a8edd83\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-7st9s" Apr 16 14:57:13.791022 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:57:13.791002 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/26a2efd2-c6ce-46da-b1fe-6a739a8edd83-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-7st9s\" (UID: \"26a2efd2-c6ce-46da-b1fe-6a739a8edd83\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-7st9s" Apr 16 14:57:13.856192 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:57:13.856162 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-xpsp2\"" Apr 16 14:57:13.863995 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:57:13.863974 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-7st9s" Apr 16 14:57:13.889197 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:57:13.889166 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/91fb67d9-4995-4dfe-bc80-067575a9d732-cert\") pod \"ingress-canary-kd7rp\" (UID: \"91fb67d9-4995-4dfe-bc80-067575a9d732\") " pod="openshift-ingress-canary/ingress-canary-kd7rp" Apr 16 14:57:13.891366 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:57:13.891348 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/91fb67d9-4995-4dfe-bc80-067575a9d732-cert\") pod \"ingress-canary-kd7rp\" (UID: \"91fb67d9-4995-4dfe-bc80-067575a9d732\") " pod="openshift-ingress-canary/ingress-canary-kd7rp" Apr 16 14:57:13.976600 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:57:13.976491 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-vpmk7" podStartSLOduration=282.103112727 podStartE2EDuration="4m42.976471971s" podCreationTimestamp="2026-04-16 14:52:31 +0000 UTC" firstStartedPulling="2026-04-16 14:56:42.480884388 +0000 UTC m=+251.663422141" lastFinishedPulling="2026-04-16 14:56:43.354243633 +0000 UTC m=+252.536781385" observedRunningTime="2026-04-16 14:56:44.200691219 +0000 UTC m=+253.383228995" watchObservedRunningTime="2026-04-16 14:57:13.976471971 +0000 UTC m=+283.159009749" Apr 16 14:57:13.977277 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:57:13.977260 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-5cb6cf4cb4-7st9s"] Apr 16 14:57:13.980288 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:57:13.980259 2579 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26a2efd2_c6ce_46da_b1fe_6a739a8edd83.slice/crio-b3b04f3ffc52f399d14a9ac4f31eceb104241642f2280ec5a0e3afe54cd42746 WatchSource:0}: Error finding container b3b04f3ffc52f399d14a9ac4f31eceb104241642f2280ec5a0e3afe54cd42746: Status 404 returned error can't find the container with id b3b04f3ffc52f399d14a9ac4f31eceb104241642f2280ec5a0e3afe54cd42746 Apr 16 14:57:14.155311 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:57:14.155228 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-9k9b9\"" Apr 16 14:57:14.163166 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:57:14.163143 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-kd7rp" Apr 16 14:57:14.262446 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:57:14.262414 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-7st9s" event={"ID":"26a2efd2-c6ce-46da-b1fe-6a739a8edd83","Type":"ContainerStarted","Data":"b3b04f3ffc52f399d14a9ac4f31eceb104241642f2280ec5a0e3afe54cd42746"} Apr 16 14:57:14.278880 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:57:14.278739 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-kd7rp"] Apr 16 14:57:14.281323 ip-10-0-139-47 kubenswrapper[2579]: W0416 14:57:14.281303 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod91fb67d9_4995_4dfe_bc80_067575a9d732.slice/crio-55b54ab57bdea7444e70f22af78add99e7bedc52d6ec943eaacd9baa33e8e713 WatchSource:0}: Error finding container 55b54ab57bdea7444e70f22af78add99e7bedc52d6ec943eaacd9baa33e8e713: Status 404 returned error can't find the container with id 55b54ab57bdea7444e70f22af78add99e7bedc52d6ec943eaacd9baa33e8e713 Apr 16 14:57:15.267482 ip-10-0-139-47 
kubenswrapper[2579]: I0416 14:57:15.267444 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-7st9s" event={"ID":"26a2efd2-c6ce-46da-b1fe-6a739a8edd83","Type":"ContainerStarted","Data":"fc307da6ff0b16bb3daf82890070da279f77d9b18bccef70b10f79affd116224"}
Apr 16 14:57:15.268686 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:57:15.268653 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-kd7rp" event={"ID":"91fb67d9-4995-4dfe-bc80-067575a9d732","Type":"ContainerStarted","Data":"55b54ab57bdea7444e70f22af78add99e7bedc52d6ec943eaacd9baa33e8e713"}
Apr 16 14:57:15.286656 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:57:15.286596 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-7st9s" podStartSLOduration=262.148960564 podStartE2EDuration="4m23.286576219s" podCreationTimestamp="2026-04-16 14:52:52 +0000 UTC" firstStartedPulling="2026-04-16 14:57:13.982094319 +0000 UTC m=+283.164632071" lastFinishedPulling="2026-04-16 14:57:15.119709959 +0000 UTC m=+284.302247726" observedRunningTime="2026-04-16 14:57:15.285440587 +0000 UTC m=+284.467978362" watchObservedRunningTime="2026-04-16 14:57:15.286576219 +0000 UTC m=+284.469113994"
Apr 16 14:57:16.272680 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:57:16.272644 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-kd7rp" event={"ID":"91fb67d9-4995-4dfe-bc80-067575a9d732","Type":"ContainerStarted","Data":"6b19a2b932c8821be3bc294586ad3576adec058ec16f7d42c934b16d9a64c914"}
Apr 16 14:57:16.289672 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:57:16.289605 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-kd7rp" podStartSLOduration=251.708187468 podStartE2EDuration="4m13.289588325s" podCreationTimestamp="2026-04-16 14:53:03 +0000 UTC" firstStartedPulling="2026-04-16 14:57:14.283146186 +0000 UTC m=+283.465683939" lastFinishedPulling="2026-04-16 14:57:15.864547044 +0000 UTC m=+285.047084796" observedRunningTime="2026-04-16 14:57:16.287872305 +0000 UTC m=+285.470410080" watchObservedRunningTime="2026-04-16 14:57:16.289588325 +0000 UTC m=+285.472126151"
Apr 16 14:57:31.274806 ip-10-0-139-47 kubenswrapper[2579]: I0416 14:57:31.274779 2579 kubelet.go:1628] "Image garbage collection succeeded"
Apr 16 15:02:36.016261 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:02:36.016225 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-255z5"]
Apr 16 15:02:36.019135 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:02:36.019114 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-255z5"
Apr 16 15:02:36.022161 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:02:36.022139 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\""
Apr 16 15:02:36.022267 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:02:36.022140 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\""
Apr 16 15:02:36.022758 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:02:36.022743 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-tfgp9\""
Apr 16 15:02:36.031939 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:02:36.031918 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-255z5"]
Apr 16 15:02:36.146595 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:02:36.146557 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c474a557-312e-428d-99fb-a62bd603471c-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-255z5\" (UID: \"c474a557-312e-428d-99fb-a62bd603471c\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-255z5"
Apr 16 15:02:36.146595 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:02:36.146605 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9j97h\" (UniqueName: \"kubernetes.io/projected/c474a557-312e-428d-99fb-a62bd603471c-kube-api-access-9j97h\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-255z5\" (UID: \"c474a557-312e-428d-99fb-a62bd603471c\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-255z5"
Apr 16 15:02:36.247364 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:02:36.247324 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c474a557-312e-428d-99fb-a62bd603471c-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-255z5\" (UID: \"c474a557-312e-428d-99fb-a62bd603471c\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-255z5"
Apr 16 15:02:36.247364 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:02:36.247370 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9j97h\" (UniqueName: \"kubernetes.io/projected/c474a557-312e-428d-99fb-a62bd603471c-kube-api-access-9j97h\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-255z5\" (UID: \"c474a557-312e-428d-99fb-a62bd603471c\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-255z5"
Apr 16 15:02:36.247726 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:02:36.247708 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c474a557-312e-428d-99fb-a62bd603471c-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-255z5\" (UID: \"c474a557-312e-428d-99fb-a62bd603471c\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-255z5"
Apr 16 15:02:36.256055 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:02:36.256032 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9j97h\" (UniqueName: \"kubernetes.io/projected/c474a557-312e-428d-99fb-a62bd603471c-kube-api-access-9j97h\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-255z5\" (UID: \"c474a557-312e-428d-99fb-a62bd603471c\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-255z5"
Apr 16 15:02:36.328109 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:02:36.328012 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-255z5"
Apr 16 15:02:36.452658 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:02:36.452612 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-255z5"]
Apr 16 15:02:36.456429 ip-10-0-139-47 kubenswrapper[2579]: W0416 15:02:36.456400 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc474a557_312e_428d_99fb_a62bd603471c.slice/crio-ad5130a9da195a5d96dab256d27454dbf0ccb001bb9751f5edecd10a79d30725 WatchSource:0}: Error finding container ad5130a9da195a5d96dab256d27454dbf0ccb001bb9751f5edecd10a79d30725: Status 404 returned error can't find the container with id ad5130a9da195a5d96dab256d27454dbf0ccb001bb9751f5edecd10a79d30725
Apr 16 15:02:36.458852 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:02:36.458836 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 15:02:37.082473 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:02:37.082433 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-255z5" event={"ID":"c474a557-312e-428d-99fb-a62bd603471c","Type":"ContainerStarted","Data":"ad5130a9da195a5d96dab256d27454dbf0ccb001bb9751f5edecd10a79d30725"}
Apr 16 15:02:40.096703 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:02:40.096663 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-255z5" event={"ID":"c474a557-312e-428d-99fb-a62bd603471c","Type":"ContainerStarted","Data":"76ffbb8bad553eb49ec796df0fa40e394411e04452cb81ff7e2db7d380b8c239"}
Apr 16 15:02:40.118819 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:02:40.118635 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-255z5" podStartSLOduration=2.36301102 podStartE2EDuration="5.118614828s" podCreationTimestamp="2026-04-16 15:02:35 +0000 UTC" firstStartedPulling="2026-04-16 15:02:36.458988975 +0000 UTC m=+605.641526726" lastFinishedPulling="2026-04-16 15:02:39.214592771 +0000 UTC m=+608.397130534" observedRunningTime="2026-04-16 15:02:40.116363017 +0000 UTC m=+609.298900793" watchObservedRunningTime="2026-04-16 15:02:40.118614828 +0000 UTC m=+609.301152603"
Apr 16 15:02:45.991450 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:02:45.991412 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-2rmtn"]
Apr 16 15:02:45.994693 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:02:45.994676 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-2rmtn"
Apr 16 15:02:45.997152 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:02:45.997129 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\""
Apr 16 15:02:45.997152 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:02:45.997129 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\""
Apr 16 15:02:45.997396 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:02:45.997243 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-cainjector-dockercfg-q4bmp\""
Apr 16 15:02:46.002608 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:02:46.002578 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-2rmtn"]
Apr 16 15:02:46.120432 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:02:46.120400 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9s8ft\" (UniqueName: \"kubernetes.io/projected/d6cce86f-ea83-4e2c-8161-b1f4381aa9d5-kube-api-access-9s8ft\") pod \"cert-manager-cainjector-8966b78d4-2rmtn\" (UID: \"d6cce86f-ea83-4e2c-8161-b1f4381aa9d5\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-2rmtn"
Apr 16 15:02:46.120619 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:02:46.120469 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d6cce86f-ea83-4e2c-8161-b1f4381aa9d5-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-2rmtn\" (UID: \"d6cce86f-ea83-4e2c-8161-b1f4381aa9d5\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-2rmtn"
Apr 16 15:02:46.221632 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:02:46.221598 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d6cce86f-ea83-4e2c-8161-b1f4381aa9d5-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-2rmtn\" (UID: \"d6cce86f-ea83-4e2c-8161-b1f4381aa9d5\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-2rmtn"
Apr 16 15:02:46.221832 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:02:46.221643 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9s8ft\" (UniqueName: \"kubernetes.io/projected/d6cce86f-ea83-4e2c-8161-b1f4381aa9d5-kube-api-access-9s8ft\") pod \"cert-manager-cainjector-8966b78d4-2rmtn\" (UID: \"d6cce86f-ea83-4e2c-8161-b1f4381aa9d5\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-2rmtn"
Apr 16 15:02:46.229716 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:02:46.229685 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d6cce86f-ea83-4e2c-8161-b1f4381aa9d5-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-2rmtn\" (UID: \"d6cce86f-ea83-4e2c-8161-b1f4381aa9d5\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-2rmtn"
Apr 16 15:02:46.229870 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:02:46.229851 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9s8ft\" (UniqueName: \"kubernetes.io/projected/d6cce86f-ea83-4e2c-8161-b1f4381aa9d5-kube-api-access-9s8ft\") pod \"cert-manager-cainjector-8966b78d4-2rmtn\" (UID: \"d6cce86f-ea83-4e2c-8161-b1f4381aa9d5\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-2rmtn"
Apr 16 15:02:46.304769 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:02:46.304729 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-2rmtn"
Apr 16 15:02:46.429757 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:02:46.429728 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-2rmtn"]
Apr 16 15:02:46.432542 ip-10-0-139-47 kubenswrapper[2579]: W0416 15:02:46.432513 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd6cce86f_ea83_4e2c_8161_b1f4381aa9d5.slice/crio-18d0457a072320c0f9681b8fc13294b718cab3a0d79cb44a49f1f2eafd7ec521 WatchSource:0}: Error finding container 18d0457a072320c0f9681b8fc13294b718cab3a0d79cb44a49f1f2eafd7ec521: Status 404 returned error can't find the container with id 18d0457a072320c0f9681b8fc13294b718cab3a0d79cb44a49f1f2eafd7ec521
Apr 16 15:02:47.116490 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:02:47.116453 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-2rmtn" event={"ID":"d6cce86f-ea83-4e2c-8161-b1f4381aa9d5","Type":"ContainerStarted","Data":"18d0457a072320c0f9681b8fc13294b718cab3a0d79cb44a49f1f2eafd7ec521"}
Apr 16 15:02:50.125957 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:02:50.125894 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-2rmtn" event={"ID":"d6cce86f-ea83-4e2c-8161-b1f4381aa9d5","Type":"ContainerStarted","Data":"824ea4dc0ab99d8750c9c1f00f794dbf4106ea28103d756daa21d97540145a6d"}
Apr 16 15:02:50.141751 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:02:50.141701 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-8966b78d4-2rmtn" podStartSLOduration=1.891580512 podStartE2EDuration="5.141682095s" podCreationTimestamp="2026-04-16 15:02:45 +0000 UTC" firstStartedPulling="2026-04-16 15:02:46.434478822 +0000 UTC m=+615.617016581" lastFinishedPulling="2026-04-16 15:02:49.684580409 +0000 UTC m=+618.867118164" observedRunningTime="2026-04-16 15:02:50.140727158 +0000 UTC m=+619.323264942" watchObservedRunningTime="2026-04-16 15:02:50.141682095 +0000 UTC m=+619.324219870"
Apr 16 15:02:52.016641 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:02:52.016606 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-759f64656b-9rdfr"]
Apr 16 15:02:52.019730 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:02:52.019714 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-759f64656b-9rdfr"
Apr 16 15:02:52.022041 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:02:52.022017 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-bjlj5\""
Apr 16 15:02:52.027331 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:02:52.027291 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-9rdfr"]
Apr 16 15:02:52.166258 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:02:52.166220 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7z4w\" (UniqueName: \"kubernetes.io/projected/325ef061-8493-4eda-a73b-3b9cd7a42f36-kube-api-access-x7z4w\") pod \"cert-manager-759f64656b-9rdfr\" (UID: \"325ef061-8493-4eda-a73b-3b9cd7a42f36\") " pod="cert-manager/cert-manager-759f64656b-9rdfr"
Apr 16 15:02:52.166258 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:02:52.166257 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/325ef061-8493-4eda-a73b-3b9cd7a42f36-bound-sa-token\") pod \"cert-manager-759f64656b-9rdfr\" (UID: \"325ef061-8493-4eda-a73b-3b9cd7a42f36\") " pod="cert-manager/cert-manager-759f64656b-9rdfr"
Apr 16 15:02:52.266932 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:02:52.266839 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x7z4w\" (UniqueName: \"kubernetes.io/projected/325ef061-8493-4eda-a73b-3b9cd7a42f36-kube-api-access-x7z4w\") pod \"cert-manager-759f64656b-9rdfr\" (UID: \"325ef061-8493-4eda-a73b-3b9cd7a42f36\") " pod="cert-manager/cert-manager-759f64656b-9rdfr"
Apr 16 15:02:52.266932 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:02:52.266873 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/325ef061-8493-4eda-a73b-3b9cd7a42f36-bound-sa-token\") pod \"cert-manager-759f64656b-9rdfr\" (UID: \"325ef061-8493-4eda-a73b-3b9cd7a42f36\") " pod="cert-manager/cert-manager-759f64656b-9rdfr"
Apr 16 15:02:52.274449 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:02:52.274410 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/325ef061-8493-4eda-a73b-3b9cd7a42f36-bound-sa-token\") pod \"cert-manager-759f64656b-9rdfr\" (UID: \"325ef061-8493-4eda-a73b-3b9cd7a42f36\") " pod="cert-manager/cert-manager-759f64656b-9rdfr"
Apr 16 15:02:52.274449 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:02:52.274419 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7z4w\" (UniqueName: \"kubernetes.io/projected/325ef061-8493-4eda-a73b-3b9cd7a42f36-kube-api-access-x7z4w\") pod \"cert-manager-759f64656b-9rdfr\" (UID: \"325ef061-8493-4eda-a73b-3b9cd7a42f36\") " pod="cert-manager/cert-manager-759f64656b-9rdfr"
Apr 16 15:02:52.329527 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:02:52.329483 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-759f64656b-9rdfr"
Apr 16 15:02:52.446888 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:02:52.446853 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-9rdfr"]
Apr 16 15:02:52.449787 ip-10-0-139-47 kubenswrapper[2579]: W0416 15:02:52.449757 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod325ef061_8493_4eda_a73b_3b9cd7a42f36.slice/crio-1c1464f54d6a49e06351088f4555e667c412d61f835e7f357fb57563378e8588 WatchSource:0}: Error finding container 1c1464f54d6a49e06351088f4555e667c412d61f835e7f357fb57563378e8588: Status 404 returned error can't find the container with id 1c1464f54d6a49e06351088f4555e667c412d61f835e7f357fb57563378e8588
Apr 16 15:02:53.136254 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:02:53.136217 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-9rdfr" event={"ID":"325ef061-8493-4eda-a73b-3b9cd7a42f36","Type":"ContainerStarted","Data":"d7d22d4e9135e5053c5c14f3427df4e60c59c39793200bcb0181bdd620355249"}
Apr 16 15:02:53.136254 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:02:53.136255 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-9rdfr" event={"ID":"325ef061-8493-4eda-a73b-3b9cd7a42f36","Type":"ContainerStarted","Data":"1c1464f54d6a49e06351088f4555e667c412d61f835e7f357fb57563378e8588"}
Apr 16 15:02:53.155120 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:02:53.155068 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-759f64656b-9rdfr" podStartSLOduration=1.15505213 podStartE2EDuration="1.15505213s" podCreationTimestamp="2026-04-16 15:02:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:02:53.153185004 +0000 UTC m=+622.335722778" watchObservedRunningTime="2026-04-16 15:02:53.15505213 +0000 UTC m=+622.337589903"
Apr 16 15:03:16.646319 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:03:16.646282 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-5bfb495c77-6ljq6"]
Apr 16 15:03:16.653689 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:03:16.653662 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-5bfb495c77-6ljq6"
Apr 16 15:03:16.656578 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:03:16.656544 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\""
Apr 16 15:03:16.657190 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:03:16.657166 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-5bfb495c77-6ljq6"]
Apr 16 15:03:16.657301 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:03:16.657267 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\""
Apr 16 15:03:16.657301 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:03:16.657276 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\""
Apr 16 15:03:16.657301 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:03:16.657294 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\""
Apr 16 15:03:16.657446 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:03:16.657277 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\""
Apr 16 15:03:16.657446 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:03:16.657268 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-jtnv8\""
Apr 16 15:03:16.747286 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:03:16.747252 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/30a313db-687c-4458-9be0-9d0c90a93236-cert\") pod \"lws-controller-manager-5bfb495c77-6ljq6\" (UID: \"30a313db-687c-4458-9be0-9d0c90a93236\") " pod="openshift-lws-operator/lws-controller-manager-5bfb495c77-6ljq6"
Apr 16 15:03:16.747459 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:03:16.747305 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/30a313db-687c-4458-9be0-9d0c90a93236-manager-config\") pod \"lws-controller-manager-5bfb495c77-6ljq6\" (UID: \"30a313db-687c-4458-9be0-9d0c90a93236\") " pod="openshift-lws-operator/lws-controller-manager-5bfb495c77-6ljq6"
Apr 16 15:03:16.747459 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:03:16.747359 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r449k\" (UniqueName: \"kubernetes.io/projected/30a313db-687c-4458-9be0-9d0c90a93236-kube-api-access-r449k\") pod \"lws-controller-manager-5bfb495c77-6ljq6\" (UID: \"30a313db-687c-4458-9be0-9d0c90a93236\") " pod="openshift-lws-operator/lws-controller-manager-5bfb495c77-6ljq6"
Apr 16 15:03:16.747459 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:03:16.747396 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/30a313db-687c-4458-9be0-9d0c90a93236-metrics-cert\") pod \"lws-controller-manager-5bfb495c77-6ljq6\" (UID: \"30a313db-687c-4458-9be0-9d0c90a93236\") " pod="openshift-lws-operator/lws-controller-manager-5bfb495c77-6ljq6"
Apr 16 15:03:16.848149 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:03:16.848109 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/30a313db-687c-4458-9be0-9d0c90a93236-metrics-cert\") pod \"lws-controller-manager-5bfb495c77-6ljq6\" (UID: \"30a313db-687c-4458-9be0-9d0c90a93236\") " pod="openshift-lws-operator/lws-controller-manager-5bfb495c77-6ljq6"
Apr 16 15:03:16.848149 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:03:16.848154 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/30a313db-687c-4458-9be0-9d0c90a93236-cert\") pod \"lws-controller-manager-5bfb495c77-6ljq6\" (UID: \"30a313db-687c-4458-9be0-9d0c90a93236\") " pod="openshift-lws-operator/lws-controller-manager-5bfb495c77-6ljq6"
Apr 16 15:03:16.848383 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:03:16.848188 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/30a313db-687c-4458-9be0-9d0c90a93236-manager-config\") pod \"lws-controller-manager-5bfb495c77-6ljq6\" (UID: \"30a313db-687c-4458-9be0-9d0c90a93236\") " pod="openshift-lws-operator/lws-controller-manager-5bfb495c77-6ljq6"
Apr 16 15:03:16.848383 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:03:16.848223 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r449k\" (UniqueName: \"kubernetes.io/projected/30a313db-687c-4458-9be0-9d0c90a93236-kube-api-access-r449k\") pod \"lws-controller-manager-5bfb495c77-6ljq6\" (UID: \"30a313db-687c-4458-9be0-9d0c90a93236\") " pod="openshift-lws-operator/lws-controller-manager-5bfb495c77-6ljq6"
Apr 16 15:03:16.848877 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:03:16.848851 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/30a313db-687c-4458-9be0-9d0c90a93236-manager-config\") pod \"lws-controller-manager-5bfb495c77-6ljq6\" (UID: \"30a313db-687c-4458-9be0-9d0c90a93236\") " pod="openshift-lws-operator/lws-controller-manager-5bfb495c77-6ljq6"
Apr 16 15:03:16.850597 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:03:16.850566 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/30a313db-687c-4458-9be0-9d0c90a93236-cert\") pod \"lws-controller-manager-5bfb495c77-6ljq6\" (UID: \"30a313db-687c-4458-9be0-9d0c90a93236\") " pod="openshift-lws-operator/lws-controller-manager-5bfb495c77-6ljq6"
Apr 16 15:03:16.850713 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:03:16.850676 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/30a313db-687c-4458-9be0-9d0c90a93236-metrics-cert\") pod \"lws-controller-manager-5bfb495c77-6ljq6\" (UID: \"30a313db-687c-4458-9be0-9d0c90a93236\") " pod="openshift-lws-operator/lws-controller-manager-5bfb495c77-6ljq6"
Apr 16 15:03:16.855137 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:03:16.855115 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r449k\" (UniqueName: \"kubernetes.io/projected/30a313db-687c-4458-9be0-9d0c90a93236-kube-api-access-r449k\") pod \"lws-controller-manager-5bfb495c77-6ljq6\" (UID: \"30a313db-687c-4458-9be0-9d0c90a93236\") " pod="openshift-lws-operator/lws-controller-manager-5bfb495c77-6ljq6"
Apr 16 15:03:16.964216 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:03:16.964116 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-5bfb495c77-6ljq6"
Apr 16 15:03:17.081484 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:03:17.081446 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-5bfb495c77-6ljq6"]
Apr 16 15:03:17.084539 ip-10-0-139-47 kubenswrapper[2579]: W0416 15:03:17.084510 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod30a313db_687c_4458_9be0_9d0c90a93236.slice/crio-ce4b70d6acf4331378838b42df68f300e39b2ef6445f7dc25f847736e42e8386 WatchSource:0}: Error finding container ce4b70d6acf4331378838b42df68f300e39b2ef6445f7dc25f847736e42e8386: Status 404 returned error can't find the container with id ce4b70d6acf4331378838b42df68f300e39b2ef6445f7dc25f847736e42e8386
Apr 16 15:03:17.204426 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:03:17.204389 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-5bfb495c77-6ljq6" event={"ID":"30a313db-687c-4458-9be0-9d0c90a93236","Type":"ContainerStarted","Data":"ce4b70d6acf4331378838b42df68f300e39b2ef6445f7dc25f847736e42e8386"}
Apr 16 15:03:21.219077 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:03:21.218987 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-5bfb495c77-6ljq6" event={"ID":"30a313db-687c-4458-9be0-9d0c90a93236","Type":"ContainerStarted","Data":"46f6f335be1b5346d6bc98da1793ddf09efccf9e9dd2eb94b3aa6754301534e6"}
Apr 16 15:03:21.219428 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:03:21.219107 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-5bfb495c77-6ljq6"
Apr 16 15:03:21.236490 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:03:21.236439 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-5bfb495c77-6ljq6" podStartSLOduration=1.474784501 podStartE2EDuration="5.236423753s" podCreationTimestamp="2026-04-16 15:03:16 +0000 UTC" firstStartedPulling="2026-04-16 15:03:17.086422149 +0000 UTC m=+646.268959900" lastFinishedPulling="2026-04-16 15:03:20.848061391 +0000 UTC m=+650.030599152" observedRunningTime="2026-04-16 15:03:21.234696837 +0000 UTC m=+650.417234636" watchObservedRunningTime="2026-04-16 15:03:21.236423753 +0000 UTC m=+650.418961526"
Apr 16 15:03:32.224644 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:03:32.224612 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-5bfb495c77-6ljq6"
Apr 16 15:04:17.565388 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:04:17.565351 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-operator-7587b89b76-xb67r"]
Apr 16 15:04:17.569478 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:04:17.569459 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-7587b89b76-xb67r"
Apr 16 15:04:17.572482 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:04:17.572452 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-operator-dockercfg-f9n4l\""
Apr 16 15:04:17.572621 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:04:17.572452 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\""
Apr 16 15:04:17.573370 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:04:17.573350 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\""
Apr 16 15:04:17.581580 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:04:17.581550 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-7587b89b76-xb67r"]
Apr 16 15:04:17.680557 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:04:17.680522 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6w6z5\" (UniqueName: \"kubernetes.io/projected/6e2062c3-c136-4b9a-a040-c9f6c9c23a3c-kube-api-access-6w6z5\") pod \"authorino-operator-7587b89b76-xb67r\" (UID: \"6e2062c3-c136-4b9a-a040-c9f6c9c23a3c\") " pod="kuadrant-system/authorino-operator-7587b89b76-xb67r"
Apr 16 15:04:17.781034 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:04:17.780983 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6w6z5\" (UniqueName: \"kubernetes.io/projected/6e2062c3-c136-4b9a-a040-c9f6c9c23a3c-kube-api-access-6w6z5\") pod \"authorino-operator-7587b89b76-xb67r\" (UID: \"6e2062c3-c136-4b9a-a040-c9f6c9c23a3c\") " pod="kuadrant-system/authorino-operator-7587b89b76-xb67r"
Apr 16 15:04:17.796365 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:04:17.796329 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6w6z5\" (UniqueName: \"kubernetes.io/projected/6e2062c3-c136-4b9a-a040-c9f6c9c23a3c-kube-api-access-6w6z5\") pod \"authorino-operator-7587b89b76-xb67r\" (UID: \"6e2062c3-c136-4b9a-a040-c9f6c9c23a3c\") " pod="kuadrant-system/authorino-operator-7587b89b76-xb67r"
Apr 16 15:04:17.880130 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:04:17.880032 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-7587b89b76-xb67r"
Apr 16 15:04:18.006048 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:04:18.006023 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-7587b89b76-xb67r"]
Apr 16 15:04:18.008606 ip-10-0-139-47 kubenswrapper[2579]: W0416 15:04:18.008577 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e2062c3_c136_4b9a_a040_c9f6c9c23a3c.slice/crio-f2c4e374f95b12b13846080ae5fb66afec96de9b4bc9be426d5450a74ebee604 WatchSource:0}: Error finding container f2c4e374f95b12b13846080ae5fb66afec96de9b4bc9be426d5450a74ebee604: Status 404 returned error can't find the container with id f2c4e374f95b12b13846080ae5fb66afec96de9b4bc9be426d5450a74ebee604
Apr 16 15:04:18.377235 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:04:18.377193 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-7587b89b76-xb67r" event={"ID":"6e2062c3-c136-4b9a-a040-c9f6c9c23a3c","Type":"ContainerStarted","Data":"f2c4e374f95b12b13846080ae5fb66afec96de9b4bc9be426d5450a74ebee604"}
Apr 16 15:04:21.388442 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:04:21.388405 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-7587b89b76-xb67r" event={"ID":"6e2062c3-c136-4b9a-a040-c9f6c9c23a3c","Type":"ContainerStarted","Data":"e194cea829ffc179539f6136ca1721501a244f9b1e777e064af874d6290de388"}
Apr 16 15:04:21.388930 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:04:21.388548 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/authorino-operator-7587b89b76-xb67r"
Apr 16 15:04:21.412539 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:04:21.412490 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-operator-7587b89b76-xb67r" podStartSLOduration=1.966922162 podStartE2EDuration="4.412477944s" podCreationTimestamp="2026-04-16 15:04:17 +0000 UTC" firstStartedPulling="2026-04-16 15:04:18.010665152 +0000 UTC m=+707.193202904" lastFinishedPulling="2026-04-16 15:04:20.456220929 +0000 UTC m=+709.638758686" observedRunningTime="2026-04-16 15:04:21.410450618 +0000 UTC m=+710.592988403" watchObservedRunningTime="2026-04-16 15:04:21.412477944 +0000 UTC m=+710.595015717"
Apr 16 15:04:32.394586 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:04:32.394556 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/authorino-operator-7587b89b76-xb67r"
Apr 16 15:05:03.739977 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:05:03.739942 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-674b59b84c-qgvn7"]
Apr 16 15:05:03.744367 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:05:03.744343 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-qgvn7"
Apr 16 15:05:03.746741 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:05:03.746719 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-l9fgz\""
Apr 16 15:05:03.751511 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:05:03.751165 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-674b59b84c-qgvn7"]
Apr 16 15:05:03.836439 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:05:03.836397 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5kq4\" (UniqueName: \"kubernetes.io/projected/2157d8f8-dfad-464b-93ce-4c9a59f083a2-kube-api-access-w5kq4\") pod \"authorino-674b59b84c-qgvn7\" (UID: \"2157d8f8-dfad-464b-93ce-4c9a59f083a2\") " pod="kuadrant-system/authorino-674b59b84c-qgvn7"
Apr 16 15:05:03.937149 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:05:03.937116 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w5kq4\" (UniqueName: \"kubernetes.io/projected/2157d8f8-dfad-464b-93ce-4c9a59f083a2-kube-api-access-w5kq4\") pod \"authorino-674b59b84c-qgvn7\" (UID: \"2157d8f8-dfad-464b-93ce-4c9a59f083a2\") " pod="kuadrant-system/authorino-674b59b84c-qgvn7"
Apr 16 15:05:03.937307 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:05:03.937243 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-79cbc94b89-tgh29"]
Apr 16 15:05:03.942079 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:05:03.941643 2579 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-tgh29" Apr 16 15:05:03.948492 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:05:03.948469 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-tgh29"] Apr 16 15:05:03.951028 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:05:03.951009 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5kq4\" (UniqueName: \"kubernetes.io/projected/2157d8f8-dfad-464b-93ce-4c9a59f083a2-kube-api-access-w5kq4\") pod \"authorino-674b59b84c-qgvn7\" (UID: \"2157d8f8-dfad-464b-93ce-4c9a59f083a2\") " pod="kuadrant-system/authorino-674b59b84c-qgvn7" Apr 16 15:05:04.038037 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:05:04.037995 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wshhp\" (UniqueName: \"kubernetes.io/projected/e80d932d-c251-44f6-8456-92ed13714c74-kube-api-access-wshhp\") pod \"authorino-79cbc94b89-tgh29\" (UID: \"e80d932d-c251-44f6-8456-92ed13714c74\") " pod="kuadrant-system/authorino-79cbc94b89-tgh29" Apr 16 15:05:04.054729 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:05:04.054703 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-qgvn7" Apr 16 15:05:04.139447 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:05:04.139415 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wshhp\" (UniqueName: \"kubernetes.io/projected/e80d932d-c251-44f6-8456-92ed13714c74-kube-api-access-wshhp\") pod \"authorino-79cbc94b89-tgh29\" (UID: \"e80d932d-c251-44f6-8456-92ed13714c74\") " pod="kuadrant-system/authorino-79cbc94b89-tgh29" Apr 16 15:05:04.147571 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:05:04.147540 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wshhp\" (UniqueName: \"kubernetes.io/projected/e80d932d-c251-44f6-8456-92ed13714c74-kube-api-access-wshhp\") pod \"authorino-79cbc94b89-tgh29\" (UID: \"e80d932d-c251-44f6-8456-92ed13714c74\") " pod="kuadrant-system/authorino-79cbc94b89-tgh29" Apr 16 15:05:04.169626 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:05:04.169594 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-674b59b84c-qgvn7"] Apr 16 15:05:04.172680 ip-10-0-139-47 kubenswrapper[2579]: W0416 15:05:04.172642 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2157d8f8_dfad_464b_93ce_4c9a59f083a2.slice/crio-fb7528ff29de9fd24b959b023ec1a8ac65cc6b711883c2208366b61f677535fd WatchSource:0}: Error finding container fb7528ff29de9fd24b959b023ec1a8ac65cc6b711883c2208366b61f677535fd: Status 404 returned error can't find the container with id fb7528ff29de9fd24b959b023ec1a8ac65cc6b711883c2208366b61f677535fd Apr 16 15:05:04.251220 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:05:04.251178 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-tgh29" Apr 16 15:05:04.364957 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:05:04.364925 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-tgh29"] Apr 16 15:05:04.367593 ip-10-0-139-47 kubenswrapper[2579]: W0416 15:05:04.367566 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode80d932d_c251_44f6_8456_92ed13714c74.slice/crio-a972f80b0d62bbf00276b0a79bed8160e343a082775ffbc53ca5b1e3218080c9 WatchSource:0}: Error finding container a972f80b0d62bbf00276b0a79bed8160e343a082775ffbc53ca5b1e3218080c9: Status 404 returned error can't find the container with id a972f80b0d62bbf00276b0a79bed8160e343a082775ffbc53ca5b1e3218080c9 Apr 16 15:05:04.526750 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:05:04.526711 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-qgvn7" event={"ID":"2157d8f8-dfad-464b-93ce-4c9a59f083a2","Type":"ContainerStarted","Data":"fb7528ff29de9fd24b959b023ec1a8ac65cc6b711883c2208366b61f677535fd"} Apr 16 15:05:04.527796 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:05:04.527769 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-tgh29" event={"ID":"e80d932d-c251-44f6-8456-92ed13714c74","Type":"ContainerStarted","Data":"a972f80b0d62bbf00276b0a79bed8160e343a082775ffbc53ca5b1e3218080c9"} Apr 16 15:05:07.539994 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:05:07.539950 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-qgvn7" event={"ID":"2157d8f8-dfad-464b-93ce-4c9a59f083a2","Type":"ContainerStarted","Data":"4f94ce5592b8637380122751552309ef7372f4dada73afcaafa7c112fb43a913"} Apr 16 15:05:07.541026 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:05:07.541008 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kuadrant-system/authorino-79cbc94b89-tgh29" event={"ID":"e80d932d-c251-44f6-8456-92ed13714c74","Type":"ContainerStarted","Data":"440f1159edab9eede87673b97cbcfb410e0ddf7a2b0a52733ffc7573514b9697"} Apr 16 15:05:07.555357 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:05:07.555316 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-674b59b84c-qgvn7" podStartSLOduration=1.709473166 podStartE2EDuration="4.555305169s" podCreationTimestamp="2026-04-16 15:05:03 +0000 UTC" firstStartedPulling="2026-04-16 15:05:04.174204343 +0000 UTC m=+753.356742095" lastFinishedPulling="2026-04-16 15:05:07.020036343 +0000 UTC m=+756.202574098" observedRunningTime="2026-04-16 15:05:07.55364442 +0000 UTC m=+756.736182195" watchObservedRunningTime="2026-04-16 15:05:07.555305169 +0000 UTC m=+756.737842942" Apr 16 15:05:07.567778 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:05:07.567732 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-79cbc94b89-tgh29" podStartSLOduration=1.926438236 podStartE2EDuration="4.567720631s" podCreationTimestamp="2026-04-16 15:05:03 +0000 UTC" firstStartedPulling="2026-04-16 15:05:04.368891415 +0000 UTC m=+753.551429166" lastFinishedPulling="2026-04-16 15:05:07.01017381 +0000 UTC m=+756.192711561" observedRunningTime="2026-04-16 15:05:07.566286982 +0000 UTC m=+756.748824759" watchObservedRunningTime="2026-04-16 15:05:07.567720631 +0000 UTC m=+756.750258406" Apr 16 15:05:07.592848 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:05:07.592816 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-674b59b84c-qgvn7"] Apr 16 15:05:09.546514 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:05:09.546472 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-674b59b84c-qgvn7" podUID="2157d8f8-dfad-464b-93ce-4c9a59f083a2" containerName="authorino" 
containerID="cri-o://4f94ce5592b8637380122751552309ef7372f4dada73afcaafa7c112fb43a913" gracePeriod=30 Apr 16 15:05:09.775474 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:05:09.775451 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-qgvn7" Apr 16 15:05:09.885552 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:05:09.885469 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w5kq4\" (UniqueName: \"kubernetes.io/projected/2157d8f8-dfad-464b-93ce-4c9a59f083a2-kube-api-access-w5kq4\") pod \"2157d8f8-dfad-464b-93ce-4c9a59f083a2\" (UID: \"2157d8f8-dfad-464b-93ce-4c9a59f083a2\") " Apr 16 15:05:09.887608 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:05:09.887585 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2157d8f8-dfad-464b-93ce-4c9a59f083a2-kube-api-access-w5kq4" (OuterVolumeSpecName: "kube-api-access-w5kq4") pod "2157d8f8-dfad-464b-93ce-4c9a59f083a2" (UID: "2157d8f8-dfad-464b-93ce-4c9a59f083a2"). InnerVolumeSpecName "kube-api-access-w5kq4". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 15:05:09.987042 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:05:09.986988 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-w5kq4\" (UniqueName: \"kubernetes.io/projected/2157d8f8-dfad-464b-93ce-4c9a59f083a2-kube-api-access-w5kq4\") on node \"ip-10-0-139-47.ec2.internal\" DevicePath \"\"" Apr 16 15:05:10.550116 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:05:10.550083 2579 generic.go:358] "Generic (PLEG): container finished" podID="2157d8f8-dfad-464b-93ce-4c9a59f083a2" containerID="4f94ce5592b8637380122751552309ef7372f4dada73afcaafa7c112fb43a913" exitCode=0 Apr 16 15:05:10.550507 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:05:10.550136 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-qgvn7" Apr 16 15:05:10.550507 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:05:10.550153 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-qgvn7" event={"ID":"2157d8f8-dfad-464b-93ce-4c9a59f083a2","Type":"ContainerDied","Data":"4f94ce5592b8637380122751552309ef7372f4dada73afcaafa7c112fb43a913"} Apr 16 15:05:10.550507 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:05:10.550181 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-qgvn7" event={"ID":"2157d8f8-dfad-464b-93ce-4c9a59f083a2","Type":"ContainerDied","Data":"fb7528ff29de9fd24b959b023ec1a8ac65cc6b711883c2208366b61f677535fd"} Apr 16 15:05:10.550507 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:05:10.550196 2579 scope.go:117] "RemoveContainer" containerID="4f94ce5592b8637380122751552309ef7372f4dada73afcaafa7c112fb43a913" Apr 16 15:05:10.558119 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:05:10.558097 2579 scope.go:117] "RemoveContainer" containerID="4f94ce5592b8637380122751552309ef7372f4dada73afcaafa7c112fb43a913" Apr 16 15:05:10.558365 ip-10-0-139-47 kubenswrapper[2579]: E0416 15:05:10.558344 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f94ce5592b8637380122751552309ef7372f4dada73afcaafa7c112fb43a913\": container with ID starting with 4f94ce5592b8637380122751552309ef7372f4dada73afcaafa7c112fb43a913 not found: ID does not exist" containerID="4f94ce5592b8637380122751552309ef7372f4dada73afcaafa7c112fb43a913" Apr 16 15:05:10.558430 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:05:10.558374 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f94ce5592b8637380122751552309ef7372f4dada73afcaafa7c112fb43a913"} err="failed to get container status \"4f94ce5592b8637380122751552309ef7372f4dada73afcaafa7c112fb43a913\": rpc error: code = 
NotFound desc = could not find container \"4f94ce5592b8637380122751552309ef7372f4dada73afcaafa7c112fb43a913\": container with ID starting with 4f94ce5592b8637380122751552309ef7372f4dada73afcaafa7c112fb43a913 not found: ID does not exist" Apr 16 15:05:10.569766 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:05:10.569735 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-674b59b84c-qgvn7"] Apr 16 15:05:10.573424 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:05:10.573402 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-674b59b84c-qgvn7"] Apr 16 15:05:11.353604 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:05:11.353568 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2157d8f8-dfad-464b-93ce-4c9a59f083a2" path="/var/lib/kubelet/pods/2157d8f8-dfad-464b-93ce-4c9a59f083a2/volumes" Apr 16 15:05:27.593609 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:05:27.593573 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-tgh29"] Apr 16 15:05:27.593993 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:05:27.593786 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-79cbc94b89-tgh29" podUID="e80d932d-c251-44f6-8456-92ed13714c74" containerName="authorino" containerID="cri-o://440f1159edab9eede87673b97cbcfb410e0ddf7a2b0a52733ffc7573514b9697" gracePeriod=30 Apr 16 15:05:27.837827 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:05:27.837802 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-tgh29" Apr 16 15:05:27.915726 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:05:27.915652 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wshhp\" (UniqueName: \"kubernetes.io/projected/e80d932d-c251-44f6-8456-92ed13714c74-kube-api-access-wshhp\") pod \"e80d932d-c251-44f6-8456-92ed13714c74\" (UID: \"e80d932d-c251-44f6-8456-92ed13714c74\") " Apr 16 15:05:27.917803 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:05:27.917765 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e80d932d-c251-44f6-8456-92ed13714c74-kube-api-access-wshhp" (OuterVolumeSpecName: "kube-api-access-wshhp") pod "e80d932d-c251-44f6-8456-92ed13714c74" (UID: "e80d932d-c251-44f6-8456-92ed13714c74"). InnerVolumeSpecName "kube-api-access-wshhp". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 15:05:28.016778 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:05:28.016748 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wshhp\" (UniqueName: \"kubernetes.io/projected/e80d932d-c251-44f6-8456-92ed13714c74-kube-api-access-wshhp\") on node \"ip-10-0-139-47.ec2.internal\" DevicePath \"\"" Apr 16 15:05:28.607864 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:05:28.607827 2579 generic.go:358] "Generic (PLEG): container finished" podID="e80d932d-c251-44f6-8456-92ed13714c74" containerID="440f1159edab9eede87673b97cbcfb410e0ddf7a2b0a52733ffc7573514b9697" exitCode=0 Apr 16 15:05:28.608249 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:05:28.607878 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-tgh29" Apr 16 15:05:28.608249 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:05:28.607888 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-tgh29" event={"ID":"e80d932d-c251-44f6-8456-92ed13714c74","Type":"ContainerDied","Data":"440f1159edab9eede87673b97cbcfb410e0ddf7a2b0a52733ffc7573514b9697"} Apr 16 15:05:28.608249 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:05:28.607934 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-tgh29" event={"ID":"e80d932d-c251-44f6-8456-92ed13714c74","Type":"ContainerDied","Data":"a972f80b0d62bbf00276b0a79bed8160e343a082775ffbc53ca5b1e3218080c9"} Apr 16 15:05:28.608249 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:05:28.607954 2579 scope.go:117] "RemoveContainer" containerID="440f1159edab9eede87673b97cbcfb410e0ddf7a2b0a52733ffc7573514b9697" Apr 16 15:05:28.615285 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:05:28.615056 2579 scope.go:117] "RemoveContainer" containerID="440f1159edab9eede87673b97cbcfb410e0ddf7a2b0a52733ffc7573514b9697" Apr 16 15:05:28.615349 ip-10-0-139-47 kubenswrapper[2579]: E0416 15:05:28.615314 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"440f1159edab9eede87673b97cbcfb410e0ddf7a2b0a52733ffc7573514b9697\": container with ID starting with 440f1159edab9eede87673b97cbcfb410e0ddf7a2b0a52733ffc7573514b9697 not found: ID does not exist" containerID="440f1159edab9eede87673b97cbcfb410e0ddf7a2b0a52733ffc7573514b9697" Apr 16 15:05:28.615401 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:05:28.615345 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"440f1159edab9eede87673b97cbcfb410e0ddf7a2b0a52733ffc7573514b9697"} err="failed to get container status \"440f1159edab9eede87673b97cbcfb410e0ddf7a2b0a52733ffc7573514b9697\": rpc error: code = 
NotFound desc = could not find container \"440f1159edab9eede87673b97cbcfb410e0ddf7a2b0a52733ffc7573514b9697\": container with ID starting with 440f1159edab9eede87673b97cbcfb410e0ddf7a2b0a52733ffc7573514b9697 not found: ID does not exist" Apr 16 15:05:28.626158 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:05:28.626134 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-tgh29"] Apr 16 15:05:28.628499 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:05:28.628482 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-tgh29"] Apr 16 15:05:29.354817 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:05:29.354784 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e80d932d-c251-44f6-8456-92ed13714c74" path="/var/lib/kubelet/pods/e80d932d-c251-44f6-8456-92ed13714c74/volumes" Apr 16 15:06:54.722026 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:06:54.721990 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-48t7w"] Apr 16 15:06:54.722483 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:06:54.722229 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e80d932d-c251-44f6-8456-92ed13714c74" containerName="authorino" Apr 16 15:06:54.722483 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:06:54.722238 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="e80d932d-c251-44f6-8456-92ed13714c74" containerName="authorino" Apr 16 15:06:54.722483 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:06:54.722255 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2157d8f8-dfad-464b-93ce-4c9a59f083a2" containerName="authorino" Apr 16 15:06:54.722483 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:06:54.722262 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="2157d8f8-dfad-464b-93ce-4c9a59f083a2" containerName="authorino" Apr 16 15:06:54.722483 ip-10-0-139-47 kubenswrapper[2579]: I0416 
15:06:54.722303 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="e80d932d-c251-44f6-8456-92ed13714c74" containerName="authorino" Apr 16 15:06:54.722483 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:06:54.722313 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="2157d8f8-dfad-464b-93ce-4c9a59f083a2" containerName="authorino" Apr 16 15:06:54.724877 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:06:54.724860 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-48t7w" Apr 16 15:06:54.727311 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:06:54.727285 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 16 15:06:54.727443 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:06:54.727295 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-zc84t\"" Apr 16 15:06:54.727443 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:06:54.727330 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 16 15:06:54.727443 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:06:54.727319 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\"" Apr 16 15:06:54.732586 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:06:54.732562 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-48t7w"] Apr 16 15:06:54.850586 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:06:54.850551 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4af35d6f-d455-4a38-9e0d-561821b8e721-cert\") pod \"odh-model-controller-696fc77849-48t7w\" (UID: \"4af35d6f-d455-4a38-9e0d-561821b8e721\") " pod="kserve/odh-model-controller-696fc77849-48t7w" 
Apr 16 15:06:54.850774 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:06:54.850642 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgdqs\" (UniqueName: \"kubernetes.io/projected/4af35d6f-d455-4a38-9e0d-561821b8e721-kube-api-access-qgdqs\") pod \"odh-model-controller-696fc77849-48t7w\" (UID: \"4af35d6f-d455-4a38-9e0d-561821b8e721\") " pod="kserve/odh-model-controller-696fc77849-48t7w" Apr 16 15:06:54.951415 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:06:54.951383 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4af35d6f-d455-4a38-9e0d-561821b8e721-cert\") pod \"odh-model-controller-696fc77849-48t7w\" (UID: \"4af35d6f-d455-4a38-9e0d-561821b8e721\") " pod="kserve/odh-model-controller-696fc77849-48t7w" Apr 16 15:06:54.951618 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:06:54.951449 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qgdqs\" (UniqueName: \"kubernetes.io/projected/4af35d6f-d455-4a38-9e0d-561821b8e721-kube-api-access-qgdqs\") pod \"odh-model-controller-696fc77849-48t7w\" (UID: \"4af35d6f-d455-4a38-9e0d-561821b8e721\") " pod="kserve/odh-model-controller-696fc77849-48t7w" Apr 16 15:06:54.953859 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:06:54.953831 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4af35d6f-d455-4a38-9e0d-561821b8e721-cert\") pod \"odh-model-controller-696fc77849-48t7w\" (UID: \"4af35d6f-d455-4a38-9e0d-561821b8e721\") " pod="kserve/odh-model-controller-696fc77849-48t7w" Apr 16 15:06:54.966729 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:06:54.966695 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgdqs\" (UniqueName: \"kubernetes.io/projected/4af35d6f-d455-4a38-9e0d-561821b8e721-kube-api-access-qgdqs\") pod 
\"odh-model-controller-696fc77849-48t7w\" (UID: \"4af35d6f-d455-4a38-9e0d-561821b8e721\") " pod="kserve/odh-model-controller-696fc77849-48t7w" Apr 16 15:06:55.035784 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:06:55.035752 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-48t7w" Apr 16 15:06:55.152161 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:06:55.152131 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-48t7w"] Apr 16 15:06:55.155524 ip-10-0-139-47 kubenswrapper[2579]: W0416 15:06:55.155498 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4af35d6f_d455_4a38_9e0d_561821b8e721.slice/crio-504d6fd58cd92b38022259f663a221db9d03687b09f857d21a512500882b0c9f WatchSource:0}: Error finding container 504d6fd58cd92b38022259f663a221db9d03687b09f857d21a512500882b0c9f: Status 404 returned error can't find the container with id 504d6fd58cd92b38022259f663a221db9d03687b09f857d21a512500882b0c9f Apr 16 15:06:55.871538 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:06:55.871500 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-48t7w" event={"ID":"4af35d6f-d455-4a38-9e0d-561821b8e721","Type":"ContainerStarted","Data":"504d6fd58cd92b38022259f663a221db9d03687b09f857d21a512500882b0c9f"} Apr 16 15:06:58.882336 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:06:58.882302 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-48t7w" event={"ID":"4af35d6f-d455-4a38-9e0d-561821b8e721","Type":"ContainerStarted","Data":"7d4dd9301c0e3370d1b808092082c841a46aa87c9e144e3b1a826801241e03bc"} Apr 16 15:06:58.882712 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:06:58.882361 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-48t7w" Apr 16 
15:06:58.898355 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:06:58.898308 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-48t7w" podStartSLOduration=2.166270363 podStartE2EDuration="4.898293522s" podCreationTimestamp="2026-04-16 15:06:54 +0000 UTC" firstStartedPulling="2026-04-16 15:06:55.157170338 +0000 UTC m=+864.339708090" lastFinishedPulling="2026-04-16 15:06:57.889193497 +0000 UTC m=+867.071731249" observedRunningTime="2026-04-16 15:06:58.896991081 +0000 UTC m=+868.079528885" watchObservedRunningTime="2026-04-16 15:06:58.898293522 +0000 UTC m=+868.080831296" Apr 16 15:07:09.886939 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:07:09.886883 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-48t7w" Apr 16 15:07:10.662112 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:07:10.662076 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-j9trf"] Apr 16 15:07:10.664859 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:07:10.664842 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-j9trf"
Apr 16 15:07:10.667280 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:07:10.667259 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-c65rw\""
Apr 16 15:07:10.667394 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:07:10.667261 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\""
Apr 16 15:07:10.671295 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:07:10.671268 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-j9trf"]
Apr 16 15:07:10.756375 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:07:10.756333 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vfbl\" (UniqueName: \"kubernetes.io/projected/ccba7863-bc8c-4937-b1f7-fd72d717b3f6-kube-api-access-4vfbl\") pod \"s3-init-j9trf\" (UID: \"ccba7863-bc8c-4937-b1f7-fd72d717b3f6\") " pod="kserve/s3-init-j9trf"
Apr 16 15:07:10.857728 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:07:10.857687 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4vfbl\" (UniqueName: \"kubernetes.io/projected/ccba7863-bc8c-4937-b1f7-fd72d717b3f6-kube-api-access-4vfbl\") pod \"s3-init-j9trf\" (UID: \"ccba7863-bc8c-4937-b1f7-fd72d717b3f6\") " pod="kserve/s3-init-j9trf"
Apr 16 15:07:10.866399 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:07:10.866368 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vfbl\" (UniqueName: \"kubernetes.io/projected/ccba7863-bc8c-4937-b1f7-fd72d717b3f6-kube-api-access-4vfbl\") pod \"s3-init-j9trf\" (UID: \"ccba7863-bc8c-4937-b1f7-fd72d717b3f6\") " pod="kserve/s3-init-j9trf"
Apr 16 15:07:10.974215 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:07:10.974132 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-j9trf"
Apr 16 15:07:11.092265 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:07:11.092231 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-j9trf"]
Apr 16 15:07:11.095395 ip-10-0-139-47 kubenswrapper[2579]: W0416 15:07:11.095366 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podccba7863_bc8c_4937_b1f7_fd72d717b3f6.slice/crio-c044408481e0d45950598bb264ef352a4ae6bb177a7318fab5926e70ae0d140e WatchSource:0}: Error finding container c044408481e0d45950598bb264ef352a4ae6bb177a7318fab5926e70ae0d140e: Status 404 returned error can't find the container with id c044408481e0d45950598bb264ef352a4ae6bb177a7318fab5926e70ae0d140e
Apr 16 15:07:11.922770 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:07:11.922727 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-j9trf" event={"ID":"ccba7863-bc8c-4937-b1f7-fd72d717b3f6","Type":"ContainerStarted","Data":"c044408481e0d45950598bb264ef352a4ae6bb177a7318fab5926e70ae0d140e"}
Apr 16 15:07:16.939754 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:07:16.939714 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-j9trf" event={"ID":"ccba7863-bc8c-4937-b1f7-fd72d717b3f6","Type":"ContainerStarted","Data":"3250d0c6bffedda9b88f0d7829a79964d3e6a9cb76a217d1de15f5d406ad0841"}
Apr 16 15:07:16.955239 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:07:16.955178 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-j9trf" podStartSLOduration=1.9410640099999998 podStartE2EDuration="6.955157937s" podCreationTimestamp="2026-04-16 15:07:10 +0000 UTC" firstStartedPulling="2026-04-16 15:07:11.097119169 +0000 UTC m=+880.279656920" lastFinishedPulling="2026-04-16 15:07:16.111213085 +0000 UTC m=+885.293750847" observedRunningTime="2026-04-16 15:07:16.9543007 +0000 UTC m=+886.136838473" watchObservedRunningTime="2026-04-16 15:07:16.955157937 +0000 UTC m=+886.137695716"
Apr 16 15:07:19.948726 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:07:19.948692 2579 generic.go:358] "Generic (PLEG): container finished" podID="ccba7863-bc8c-4937-b1f7-fd72d717b3f6" containerID="3250d0c6bffedda9b88f0d7829a79964d3e6a9cb76a217d1de15f5d406ad0841" exitCode=0
Apr 16 15:07:19.949109 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:07:19.948747 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-j9trf" event={"ID":"ccba7863-bc8c-4937-b1f7-fd72d717b3f6","Type":"ContainerDied","Data":"3250d0c6bffedda9b88f0d7829a79964d3e6a9cb76a217d1de15f5d406ad0841"}
Apr 16 15:07:21.076078 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:07:21.076054 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-j9trf"
Apr 16 15:07:21.132633 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:07:21.132595 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4vfbl\" (UniqueName: \"kubernetes.io/projected/ccba7863-bc8c-4937-b1f7-fd72d717b3f6-kube-api-access-4vfbl\") pod \"ccba7863-bc8c-4937-b1f7-fd72d717b3f6\" (UID: \"ccba7863-bc8c-4937-b1f7-fd72d717b3f6\") "
Apr 16 15:07:21.134784 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:07:21.134758 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ccba7863-bc8c-4937-b1f7-fd72d717b3f6-kube-api-access-4vfbl" (OuterVolumeSpecName: "kube-api-access-4vfbl") pod "ccba7863-bc8c-4937-b1f7-fd72d717b3f6" (UID: "ccba7863-bc8c-4937-b1f7-fd72d717b3f6"). InnerVolumeSpecName "kube-api-access-4vfbl". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 15:07:21.233698 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:07:21.233612 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4vfbl\" (UniqueName: \"kubernetes.io/projected/ccba7863-bc8c-4937-b1f7-fd72d717b3f6-kube-api-access-4vfbl\") on node \"ip-10-0-139-47.ec2.internal\" DevicePath \"\""
Apr 16 15:07:21.955545 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:07:21.955505 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-j9trf" event={"ID":"ccba7863-bc8c-4937-b1f7-fd72d717b3f6","Type":"ContainerDied","Data":"c044408481e0d45950598bb264ef352a4ae6bb177a7318fab5926e70ae0d140e"}
Apr 16 15:07:21.955545 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:07:21.955543 2579 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c044408481e0d45950598bb264ef352a4ae6bb177a7318fab5926e70ae0d140e"
Apr 16 15:07:21.955753 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:07:21.955554 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-j9trf"
Apr 16 15:07:31.683552 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:07:31.683518 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-x74z6"]
Apr 16 15:07:31.684077 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:07:31.683798 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ccba7863-bc8c-4937-b1f7-fd72d717b3f6" containerName="s3-init"
Apr 16 15:07:31.684077 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:07:31.683809 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccba7863-bc8c-4937-b1f7-fd72d717b3f6" containerName="s3-init"
Apr 16 15:07:31.684077 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:07:31.683864 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="ccba7863-bc8c-4937-b1f7-fd72d717b3f6" containerName="s3-init"
Apr 16 15:07:31.717932 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:07:31.717862 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-x74z6"]
Apr 16 15:07:31.718102 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:07:31.718071 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-x74z6"
Apr 16 15:07:31.720824 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:07:31.720791 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"istio-ca-root-cert\""
Apr 16 15:07:31.720824 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:07:31.720815 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 16 15:07:31.721029 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:07:31.720940 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\""
Apr 16 15:07:31.721029 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:07:31.720982 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-gateway-1-openshift-default-dockercfg-b8kdd\""
Apr 16 15:07:31.809631 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:07:31.809596 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/6b6be884-0cea-4117-b0fb-ecbaae0e7486-workload-certs\") pod \"router-gateway-1-openshift-default-6c59fbf55c-x74z6\" (UID: \"6b6be884-0cea-4117-b0fb-ecbaae0e7486\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-x74z6"
Apr 16 15:07:31.809631 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:07:31.809637 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpbrs\" (UniqueName: \"kubernetes.io/projected/6b6be884-0cea-4117-b0fb-ecbaae0e7486-kube-api-access-zpbrs\") pod \"router-gateway-1-openshift-default-6c59fbf55c-x74z6\" (UID: \"6b6be884-0cea-4117-b0fb-ecbaae0e7486\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-x74z6"
Apr 16 15:07:31.809831 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:07:31.809669 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/6b6be884-0cea-4117-b0fb-ecbaae0e7486-istio-data\") pod \"router-gateway-1-openshift-default-6c59fbf55c-x74z6\" (UID: \"6b6be884-0cea-4117-b0fb-ecbaae0e7486\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-x74z6"
Apr 16 15:07:31.809831 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:07:31.809726 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/6b6be884-0cea-4117-b0fb-ecbaae0e7486-workload-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-x74z6\" (UID: \"6b6be884-0cea-4117-b0fb-ecbaae0e7486\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-x74z6"
Apr 16 15:07:31.809831 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:07:31.809765 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/6b6be884-0cea-4117-b0fb-ecbaae0e7486-istio-podinfo\") pod \"router-gateway-1-openshift-default-6c59fbf55c-x74z6\" (UID: \"6b6be884-0cea-4117-b0fb-ecbaae0e7486\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-x74z6"
Apr 16 15:07:31.809831 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:07:31.809782 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/6b6be884-0cea-4117-b0fb-ecbaae0e7486-istio-token\") pod \"router-gateway-1-openshift-default-6c59fbf55c-x74z6\" (UID: \"6b6be884-0cea-4117-b0fb-ecbaae0e7486\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-x74z6"
Apr 16 15:07:31.809831 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:07:31.809824 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/6b6be884-0cea-4117-b0fb-ecbaae0e7486-istiod-ca-cert\") pod \"router-gateway-1-openshift-default-6c59fbf55c-x74z6\" (UID: \"6b6be884-0cea-4117-b0fb-ecbaae0e7486\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-x74z6"
Apr 16 15:07:31.810019 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:07:31.809848 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/6b6be884-0cea-4117-b0fb-ecbaae0e7486-credential-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-x74z6\" (UID: \"6b6be884-0cea-4117-b0fb-ecbaae0e7486\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-x74z6"
Apr 16 15:07:31.810019 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:07:31.809864 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/6b6be884-0cea-4117-b0fb-ecbaae0e7486-istio-envoy\") pod \"router-gateway-1-openshift-default-6c59fbf55c-x74z6\" (UID: \"6b6be884-0cea-4117-b0fb-ecbaae0e7486\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-x74z6"
Apr 16 15:07:31.911258 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:07:31.911211 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/6b6be884-0cea-4117-b0fb-ecbaae0e7486-credential-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-x74z6\" (UID: \"6b6be884-0cea-4117-b0fb-ecbaae0e7486\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-x74z6"
Apr 16 15:07:31.911258 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:07:31.911254 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/6b6be884-0cea-4117-b0fb-ecbaae0e7486-istio-envoy\") pod \"router-gateway-1-openshift-default-6c59fbf55c-x74z6\" (UID: \"6b6be884-0cea-4117-b0fb-ecbaae0e7486\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-x74z6"
Apr 16 15:07:31.911501 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:07:31.911274 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/6b6be884-0cea-4117-b0fb-ecbaae0e7486-workload-certs\") pod \"router-gateway-1-openshift-default-6c59fbf55c-x74z6\" (UID: \"6b6be884-0cea-4117-b0fb-ecbaae0e7486\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-x74z6"
Apr 16 15:07:31.911501 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:07:31.911292 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zpbrs\" (UniqueName: \"kubernetes.io/projected/6b6be884-0cea-4117-b0fb-ecbaae0e7486-kube-api-access-zpbrs\") pod \"router-gateway-1-openshift-default-6c59fbf55c-x74z6\" (UID: \"6b6be884-0cea-4117-b0fb-ecbaae0e7486\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-x74z6"
Apr 16 15:07:31.911501 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:07:31.911313 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/6b6be884-0cea-4117-b0fb-ecbaae0e7486-istio-data\") pod \"router-gateway-1-openshift-default-6c59fbf55c-x74z6\" (UID: \"6b6be884-0cea-4117-b0fb-ecbaae0e7486\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-x74z6"
Apr 16 15:07:31.911652 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:07:31.911495 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/6b6be884-0cea-4117-b0fb-ecbaae0e7486-workload-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-x74z6\" (UID: \"6b6be884-0cea-4117-b0fb-ecbaae0e7486\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-x74z6"
Apr 16 15:07:31.911652 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:07:31.911559 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/6b6be884-0cea-4117-b0fb-ecbaae0e7486-istio-podinfo\") pod \"router-gateway-1-openshift-default-6c59fbf55c-x74z6\" (UID: \"6b6be884-0cea-4117-b0fb-ecbaae0e7486\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-x74z6"
Apr 16 15:07:31.911652 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:07:31.911588 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/6b6be884-0cea-4117-b0fb-ecbaae0e7486-istio-token\") pod \"router-gateway-1-openshift-default-6c59fbf55c-x74z6\" (UID: \"6b6be884-0cea-4117-b0fb-ecbaae0e7486\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-x74z6"
Apr 16 15:07:31.911652 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:07:31.911618 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/6b6be884-0cea-4117-b0fb-ecbaae0e7486-istiod-ca-cert\") pod \"router-gateway-1-openshift-default-6c59fbf55c-x74z6\" (UID: \"6b6be884-0cea-4117-b0fb-ecbaae0e7486\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-x74z6"
Apr 16 15:07:31.911888 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:07:31.911749 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/6b6be884-0cea-4117-b0fb-ecbaae0e7486-workload-certs\") pod \"router-gateway-1-openshift-default-6c59fbf55c-x74z6\" (UID: \"6b6be884-0cea-4117-b0fb-ecbaae0e7486\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-x74z6"
Apr 16 15:07:31.911888 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:07:31.911776 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/6b6be884-0cea-4117-b0fb-ecbaae0e7486-istio-data\") pod \"router-gateway-1-openshift-default-6c59fbf55c-x74z6\" (UID: \"6b6be884-0cea-4117-b0fb-ecbaae0e7486\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-x74z6"
Apr 16 15:07:31.912104 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:07:31.912082 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/6b6be884-0cea-4117-b0fb-ecbaae0e7486-workload-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-x74z6\" (UID: \"6b6be884-0cea-4117-b0fb-ecbaae0e7486\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-x74z6"
Apr 16 15:07:31.912278 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:07:31.912253 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/6b6be884-0cea-4117-b0fb-ecbaae0e7486-credential-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-x74z6\" (UID: \"6b6be884-0cea-4117-b0fb-ecbaae0e7486\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-x74z6"
Apr 16 15:07:31.912410 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:07:31.912391 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/6b6be884-0cea-4117-b0fb-ecbaae0e7486-istiod-ca-cert\") pod \"router-gateway-1-openshift-default-6c59fbf55c-x74z6\" (UID: \"6b6be884-0cea-4117-b0fb-ecbaae0e7486\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-x74z6"
Apr 16 15:07:31.913868 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:07:31.913846 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/6b6be884-0cea-4117-b0fb-ecbaae0e7486-istio-envoy\") pod \"router-gateway-1-openshift-default-6c59fbf55c-x74z6\" (UID: \"6b6be884-0cea-4117-b0fb-ecbaae0e7486\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-x74z6"
Apr 16 15:07:31.914161 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:07:31.914145 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/6b6be884-0cea-4117-b0fb-ecbaae0e7486-istio-podinfo\") pod \"router-gateway-1-openshift-default-6c59fbf55c-x74z6\" (UID: \"6b6be884-0cea-4117-b0fb-ecbaae0e7486\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-x74z6"
Apr 16 15:07:31.919025 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:07:31.919003 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpbrs\" (UniqueName: \"kubernetes.io/projected/6b6be884-0cea-4117-b0fb-ecbaae0e7486-kube-api-access-zpbrs\") pod \"router-gateway-1-openshift-default-6c59fbf55c-x74z6\" (UID: \"6b6be884-0cea-4117-b0fb-ecbaae0e7486\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-x74z6"
Apr 16 15:07:31.919133 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:07:31.919006 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/6b6be884-0cea-4117-b0fb-ecbaae0e7486-istio-token\") pod \"router-gateway-1-openshift-default-6c59fbf55c-x74z6\" (UID: \"6b6be884-0cea-4117-b0fb-ecbaae0e7486\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-x74z6"
Apr 16 15:07:32.028960 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:07:32.028914 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-x74z6"
Apr 16 15:07:32.164851 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:07:32.164825 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-x74z6"]
Apr 16 15:07:32.167263 ip-10-0-139-47 kubenswrapper[2579]: W0416 15:07:32.167223 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b6be884_0cea_4117_b0fb_ecbaae0e7486.slice/crio-e320e796a29c3e184dbb5f707c973096ce852dcfedf380703513327e4ab722fe WatchSource:0}: Error finding container e320e796a29c3e184dbb5f707c973096ce852dcfedf380703513327e4ab722fe: Status 404 returned error can't find the container with id e320e796a29c3e184dbb5f707c973096ce852dcfedf380703513327e4ab722fe
Apr 16 15:07:32.992249 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:07:32.992215 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-x74z6" event={"ID":"6b6be884-0cea-4117-b0fb-ecbaae0e7486","Type":"ContainerStarted","Data":"e320e796a29c3e184dbb5f707c973096ce852dcfedf380703513327e4ab722fe"}
Apr 16 15:07:36.876893 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:07:36.876853 2579 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"}
Apr 16 15:07:36.877217 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:07:36.876949 2579 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"}
Apr 16 15:07:36.877217 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:07:36.876983 2579 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"}
Apr 16 15:07:37.005586 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:07:37.005548 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-x74z6" event={"ID":"6b6be884-0cea-4117-b0fb-ecbaae0e7486","Type":"ContainerStarted","Data":"a8808ce456903a5476df91fb8c3f75e8f084f0c99cffc32fcdeab5380e6e0bbd"}
Apr 16 15:07:37.025694 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:07:37.025648 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-x74z6" podStartSLOduration=1.318233159 podStartE2EDuration="6.025625711s" podCreationTimestamp="2026-04-16 15:07:31 +0000 UTC" firstStartedPulling="2026-04-16 15:07:32.169263978 +0000 UTC m=+901.351801735" lastFinishedPulling="2026-04-16 15:07:36.876656521 +0000 UTC m=+906.059194287" observedRunningTime="2026-04-16 15:07:37.02361688 +0000 UTC m=+906.206154653" watchObservedRunningTime="2026-04-16 15:07:37.025625711 +0000 UTC m=+906.208163482"
Apr 16 15:07:37.029684 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:07:37.029643 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-x74z6"
Apr 16 15:07:37.031014 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:07:37.030985 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-x74z6" podUID="6b6be884-0cea-4117-b0fb-ecbaae0e7486" containerName="istio-proxy" probeResult="failure" output="Get \"http://10.132.0.23:15021/healthz/ready\": dial tcp 10.132.0.23:15021: connect: connection refused"
Apr 16 15:07:38.033928 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:07:38.033887 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-x74z6"
Apr 16 15:07:39.012247 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:07:39.012217 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-x74z6"
Apr 16 15:07:39.013448 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:07:39.013423 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-x74z6"
Apr 16 15:08:03.945587 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:08:03.945551 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-56f5cfcbc8dx46j"]
Apr 16 15:08:03.949359 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:08:03.949340 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-56f5cfcbc8dx46j"
Apr 16 15:08:03.952869 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:08:03.952845 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvad71fa5348b85aebd404221bba611457-kserve-self-signed-certs\""
Apr 16 15:08:03.952869 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:08:03.952858 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-dmzth\""
Apr 16 15:08:03.958945 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:08:03.958917 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-56f5cfcbc8dx46j"]
Apr 16 15:08:04.078742 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:08:04.078690 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/bbb7ee16-5fd6-41a4-8ad1-b874dd5f048a-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-56f5cfcbc8dx46j\" (UID: \"bbb7ee16-5fd6-41a4-8ad1-b874dd5f048a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-56f5cfcbc8dx46j"
Apr 16 15:08:04.078942 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:08:04.078766 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/bbb7ee16-5fd6-41a4-8ad1-b874dd5f048a-home\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-56f5cfcbc8dx46j\" (UID: \"bbb7ee16-5fd6-41a4-8ad1-b874dd5f048a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-56f5cfcbc8dx46j"
Apr 16 15:08:04.078942 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:08:04.078804 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/bbb7ee16-5fd6-41a4-8ad1-b874dd5f048a-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-56f5cfcbc8dx46j\" (UID: \"bbb7ee16-5fd6-41a4-8ad1-b874dd5f048a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-56f5cfcbc8dx46j"
Apr 16 15:08:04.078942 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:08:04.078852 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9qq6\" (UniqueName: \"kubernetes.io/projected/bbb7ee16-5fd6-41a4-8ad1-b874dd5f048a-kube-api-access-q9qq6\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-56f5cfcbc8dx46j\" (UID: \"bbb7ee16-5fd6-41a4-8ad1-b874dd5f048a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-56f5cfcbc8dx46j"
Apr 16 15:08:04.078942 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:08:04.078886 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bbb7ee16-5fd6-41a4-8ad1-b874dd5f048a-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-56f5cfcbc8dx46j\" (UID: \"bbb7ee16-5fd6-41a4-8ad1-b874dd5f048a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-56f5cfcbc8dx46j"
Apr 16 15:08:04.079126 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:08:04.078971 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/bbb7ee16-5fd6-41a4-8ad1-b874dd5f048a-dshm\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-56f5cfcbc8dx46j\" (UID: \"bbb7ee16-5fd6-41a4-8ad1-b874dd5f048a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-56f5cfcbc8dx46j"
Apr 16 15:08:04.180309 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:08:04.180276 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/bbb7ee16-5fd6-41a4-8ad1-b874dd5f048a-dshm\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-56f5cfcbc8dx46j\" (UID: \"bbb7ee16-5fd6-41a4-8ad1-b874dd5f048a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-56f5cfcbc8dx46j"
Apr 16 15:08:04.180493 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:08:04.180331 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/bbb7ee16-5fd6-41a4-8ad1-b874dd5f048a-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-56f5cfcbc8dx46j\" (UID: \"bbb7ee16-5fd6-41a4-8ad1-b874dd5f048a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-56f5cfcbc8dx46j"
Apr 16 15:08:04.180493 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:08:04.180348 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/bbb7ee16-5fd6-41a4-8ad1-b874dd5f048a-home\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-56f5cfcbc8dx46j\" (UID: \"bbb7ee16-5fd6-41a4-8ad1-b874dd5f048a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-56f5cfcbc8dx46j"
Apr 16 15:08:04.180493 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:08:04.180365 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/bbb7ee16-5fd6-41a4-8ad1-b874dd5f048a-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-56f5cfcbc8dx46j\" (UID: \"bbb7ee16-5fd6-41a4-8ad1-b874dd5f048a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-56f5cfcbc8dx46j"
Apr 16 15:08:04.180493 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:08:04.180393 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q9qq6\" (UniqueName: \"kubernetes.io/projected/bbb7ee16-5fd6-41a4-8ad1-b874dd5f048a-kube-api-access-q9qq6\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-56f5cfcbc8dx46j\" (UID: \"bbb7ee16-5fd6-41a4-8ad1-b874dd5f048a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-56f5cfcbc8dx46j"
Apr 16 15:08:04.180493 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:08:04.180413 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bbb7ee16-5fd6-41a4-8ad1-b874dd5f048a-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-56f5cfcbc8dx46j\" (UID: \"bbb7ee16-5fd6-41a4-8ad1-b874dd5f048a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-56f5cfcbc8dx46j"
Apr 16 15:08:04.181123 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:08:04.181093 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/bbb7ee16-5fd6-41a4-8ad1-b874dd5f048a-home\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-56f5cfcbc8dx46j\" (UID: \"bbb7ee16-5fd6-41a4-8ad1-b874dd5f048a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-56f5cfcbc8dx46j"
Apr 16 15:08:04.181349 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:08:04.181325 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/bbb7ee16-5fd6-41a4-8ad1-b874dd5f048a-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-56f5cfcbc8dx46j\" (UID: \"bbb7ee16-5fd6-41a4-8ad1-b874dd5f048a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-56f5cfcbc8dx46j"
Apr 16 15:08:04.181438 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:08:04.181302 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bbb7ee16-5fd6-41a4-8ad1-b874dd5f048a-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-56f5cfcbc8dx46j\" (UID: \"bbb7ee16-5fd6-41a4-8ad1-b874dd5f048a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-56f5cfcbc8dx46j"
Apr 16 15:08:04.187239 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:08:04.187215 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/bbb7ee16-5fd6-41a4-8ad1-b874dd5f048a-dshm\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-56f5cfcbc8dx46j\" (UID: \"bbb7ee16-5fd6-41a4-8ad1-b874dd5f048a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-56f5cfcbc8dx46j"
Apr 16 15:08:04.187451 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:08:04.187433 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/bbb7ee16-5fd6-41a4-8ad1-b874dd5f048a-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-56f5cfcbc8dx46j\" (UID: \"bbb7ee16-5fd6-41a4-8ad1-b874dd5f048a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-56f5cfcbc8dx46j"
Apr 16 15:08:04.189780 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:08:04.189753 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9qq6\" (UniqueName: \"kubernetes.io/projected/bbb7ee16-5fd6-41a4-8ad1-b874dd5f048a-kube-api-access-q9qq6\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-56f5cfcbc8dx46j\" (UID: \"bbb7ee16-5fd6-41a4-8ad1-b874dd5f048a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-56f5cfcbc8dx46j"
Apr 16 15:08:04.261225 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:08:04.261193 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-56f5cfcbc8dx46j"
Apr 16 15:08:04.390561 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:08:04.390526 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-56f5cfcbc8dx46j"]
Apr 16 15:08:04.393585 ip-10-0-139-47 kubenswrapper[2579]: W0416 15:08:04.393557 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbbb7ee16_5fd6_41a4_8ad1_b874dd5f048a.slice/crio-5ed29dd326fa98b9ac09c1edbfe25097eff533e8b7cefe3dc3f1f912a1a3f579 WatchSource:0}: Error finding container 5ed29dd326fa98b9ac09c1edbfe25097eff533e8b7cefe3dc3f1f912a1a3f579: Status 404 returned error can't find the container with id 5ed29dd326fa98b9ac09c1edbfe25097eff533e8b7cefe3dc3f1f912a1a3f579
Apr 16 15:08:04.395753 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:08:04.395734 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 15:08:05.097644 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:08:05.097594 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-56f5cfcbc8dx46j" event={"ID":"bbb7ee16-5fd6-41a4-8ad1-b874dd5f048a","Type":"ContainerStarted","Data":"5ed29dd326fa98b9ac09c1edbfe25097eff533e8b7cefe3dc3f1f912a1a3f579"}
Apr 16 15:08:09.116879 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:08:09.116840 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-56f5cfcbc8dx46j" event={"ID":"bbb7ee16-5fd6-41a4-8ad1-b874dd5f048a","Type":"ContainerStarted","Data":"05b7eb74a2a9e89392b59c573bd615ba5aa64e54241ccc1de76308ac11568ffd"}
Apr 16 15:08:13.132006 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:08:13.131967 2579 generic.go:358] "Generic (PLEG): container finished" podID="bbb7ee16-5fd6-41a4-8ad1-b874dd5f048a" containerID="05b7eb74a2a9e89392b59c573bd615ba5aa64e54241ccc1de76308ac11568ffd" exitCode=0
Apr 16 15:08:13.132383 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:08:13.132019 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-56f5cfcbc8dx46j" event={"ID":"bbb7ee16-5fd6-41a4-8ad1-b874dd5f048a","Type":"ContainerDied","Data":"05b7eb74a2a9e89392b59c573bd615ba5aa64e54241ccc1de76308ac11568ffd"}
Apr 16 15:08:15.139754 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:08:15.139719 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-56f5cfcbc8dx46j" event={"ID":"bbb7ee16-5fd6-41a4-8ad1-b874dd5f048a","Type":"ContainerStarted","Data":"088f30f166b13251e499f2e32d0fe8f36022fc50a6b3e01282d06eb9bf433dfd"}
Apr 16 15:08:15.156597 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:08:15.156540 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-56f5cfcbc8dx46j" podStartSLOduration=2.208191467 podStartE2EDuration="12.156522148s" podCreationTimestamp="2026-04-16 15:08:03 +0000 UTC" firstStartedPulling="2026-04-16 15:08:04.395862299 +0000 UTC m=+933.578400051" lastFinishedPulling="2026-04-16 15:08:14.344192966 +0000 UTC m=+943.526730732" observedRunningTime="2026-04-16 15:08:15.155254337 +0000 UTC m=+944.337792124" watchObservedRunningTime="2026-04-16 15:08:15.156522148 +0000 UTC m=+944.339059923"
Apr 16 15:08:24.261541 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:08:24.261499 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-56f5cfcbc8dx46j"
Apr 16 15:08:24.261541 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:08:24.261545 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-56f5cfcbc8dx46j"
Apr 16 15:08:24.273920 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:08:24.273878 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-56f5cfcbc8dx46j"
Apr 16 15:08:25.180749 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:08:25.180720 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-56f5cfcbc8dx46j"
Apr 16 15:08:37.887651 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:08:37.887619 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-56f5cfcbc8dx46j"]
Apr 16 15:08:37.888046 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:08:37.887891 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-56f5cfcbc8dx46j" podUID="bbb7ee16-5fd6-41a4-8ad1-b874dd5f048a" containerName="main" containerID="cri-o://088f30f166b13251e499f2e32d0fe8f36022fc50a6b3e01282d06eb9bf433dfd" gracePeriod=30
Apr 16 15:08:38.158753 ip-10-0-139-47
kubenswrapper[2579]: I0416 15:08:38.158729 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-56f5cfcbc8dx46j" Apr 16 15:08:38.211415 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:08:38.211376 2579 generic.go:358] "Generic (PLEG): container finished" podID="bbb7ee16-5fd6-41a4-8ad1-b874dd5f048a" containerID="088f30f166b13251e499f2e32d0fe8f36022fc50a6b3e01282d06eb9bf433dfd" exitCode=0 Apr 16 15:08:38.211588 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:08:38.211454 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-56f5cfcbc8dx46j" Apr 16 15:08:38.211588 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:08:38.211465 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-56f5cfcbc8dx46j" event={"ID":"bbb7ee16-5fd6-41a4-8ad1-b874dd5f048a","Type":"ContainerDied","Data":"088f30f166b13251e499f2e32d0fe8f36022fc50a6b3e01282d06eb9bf433dfd"} Apr 16 15:08:38.211588 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:08:38.211506 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-56f5cfcbc8dx46j" event={"ID":"bbb7ee16-5fd6-41a4-8ad1-b874dd5f048a","Type":"ContainerDied","Data":"5ed29dd326fa98b9ac09c1edbfe25097eff533e8b7cefe3dc3f1f912a1a3f579"} Apr 16 15:08:38.211588 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:08:38.211526 2579 scope.go:117] "RemoveContainer" containerID="088f30f166b13251e499f2e32d0fe8f36022fc50a6b3e01282d06eb9bf433dfd" Apr 16 15:08:38.219114 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:08:38.219089 2579 scope.go:117] "RemoveContainer" containerID="05b7eb74a2a9e89392b59c573bd615ba5aa64e54241ccc1de76308ac11568ffd" Apr 16 15:08:38.257634 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:08:38.257602 2579 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/bbb7ee16-5fd6-41a4-8ad1-b874dd5f048a-model-cache\") pod \"bbb7ee16-5fd6-41a4-8ad1-b874dd5f048a\" (UID: \"bbb7ee16-5fd6-41a4-8ad1-b874dd5f048a\") " Apr 16 15:08:38.257813 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:08:38.257668 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/bbb7ee16-5fd6-41a4-8ad1-b874dd5f048a-tls-certs\") pod \"bbb7ee16-5fd6-41a4-8ad1-b874dd5f048a\" (UID: \"bbb7ee16-5fd6-41a4-8ad1-b874dd5f048a\") " Apr 16 15:08:38.257813 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:08:38.257696 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/bbb7ee16-5fd6-41a4-8ad1-b874dd5f048a-dshm\") pod \"bbb7ee16-5fd6-41a4-8ad1-b874dd5f048a\" (UID: \"bbb7ee16-5fd6-41a4-8ad1-b874dd5f048a\") " Apr 16 15:08:38.257813 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:08:38.257717 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bbb7ee16-5fd6-41a4-8ad1-b874dd5f048a-kserve-provision-location\") pod \"bbb7ee16-5fd6-41a4-8ad1-b874dd5f048a\" (UID: \"bbb7ee16-5fd6-41a4-8ad1-b874dd5f048a\") " Apr 16 15:08:38.257813 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:08:38.257737 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/bbb7ee16-5fd6-41a4-8ad1-b874dd5f048a-home\") pod \"bbb7ee16-5fd6-41a4-8ad1-b874dd5f048a\" (UID: \"bbb7ee16-5fd6-41a4-8ad1-b874dd5f048a\") " Apr 16 15:08:38.257813 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:08:38.257760 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q9qq6\" (UniqueName: 
\"kubernetes.io/projected/bbb7ee16-5fd6-41a4-8ad1-b874dd5f048a-kube-api-access-q9qq6\") pod \"bbb7ee16-5fd6-41a4-8ad1-b874dd5f048a\" (UID: \"bbb7ee16-5fd6-41a4-8ad1-b874dd5f048a\") " Apr 16 15:08:38.258080 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:08:38.257932 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bbb7ee16-5fd6-41a4-8ad1-b874dd5f048a-model-cache" (OuterVolumeSpecName: "model-cache") pod "bbb7ee16-5fd6-41a4-8ad1-b874dd5f048a" (UID: "bbb7ee16-5fd6-41a4-8ad1-b874dd5f048a"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:08:38.258318 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:08:38.258286 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bbb7ee16-5fd6-41a4-8ad1-b874dd5f048a-home" (OuterVolumeSpecName: "home") pod "bbb7ee16-5fd6-41a4-8ad1-b874dd5f048a" (UID: "bbb7ee16-5fd6-41a4-8ad1-b874dd5f048a"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:08:38.259932 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:08:38.259877 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bbb7ee16-5fd6-41a4-8ad1-b874dd5f048a-dshm" (OuterVolumeSpecName: "dshm") pod "bbb7ee16-5fd6-41a4-8ad1-b874dd5f048a" (UID: "bbb7ee16-5fd6-41a4-8ad1-b874dd5f048a"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:08:38.260258 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:08:38.260234 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bbb7ee16-5fd6-41a4-8ad1-b874dd5f048a-kube-api-access-q9qq6" (OuterVolumeSpecName: "kube-api-access-q9qq6") pod "bbb7ee16-5fd6-41a4-8ad1-b874dd5f048a" (UID: "bbb7ee16-5fd6-41a4-8ad1-b874dd5f048a"). InnerVolumeSpecName "kube-api-access-q9qq6". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 15:08:38.260525 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:08:38.260505 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbb7ee16-5fd6-41a4-8ad1-b874dd5f048a-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "bbb7ee16-5fd6-41a4-8ad1-b874dd5f048a" (UID: "bbb7ee16-5fd6-41a4-8ad1-b874dd5f048a"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 15:08:38.287526 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:08:38.287487 2579 scope.go:117] "RemoveContainer" containerID="088f30f166b13251e499f2e32d0fe8f36022fc50a6b3e01282d06eb9bf433dfd" Apr 16 15:08:38.287873 ip-10-0-139-47 kubenswrapper[2579]: E0416 15:08:38.287851 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"088f30f166b13251e499f2e32d0fe8f36022fc50a6b3e01282d06eb9bf433dfd\": container with ID starting with 088f30f166b13251e499f2e32d0fe8f36022fc50a6b3e01282d06eb9bf433dfd not found: ID does not exist" containerID="088f30f166b13251e499f2e32d0fe8f36022fc50a6b3e01282d06eb9bf433dfd" Apr 16 15:08:38.288037 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:08:38.287889 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"088f30f166b13251e499f2e32d0fe8f36022fc50a6b3e01282d06eb9bf433dfd"} err="failed to get container status \"088f30f166b13251e499f2e32d0fe8f36022fc50a6b3e01282d06eb9bf433dfd\": rpc error: code = NotFound desc = could not find container \"088f30f166b13251e499f2e32d0fe8f36022fc50a6b3e01282d06eb9bf433dfd\": container with ID starting with 088f30f166b13251e499f2e32d0fe8f36022fc50a6b3e01282d06eb9bf433dfd not found: ID does not exist" Apr 16 15:08:38.288135 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:08:38.288042 2579 scope.go:117] "RemoveContainer" containerID="05b7eb74a2a9e89392b59c573bd615ba5aa64e54241ccc1de76308ac11568ffd" Apr 16 
15:08:38.288344 ip-10-0-139-47 kubenswrapper[2579]: E0416 15:08:38.288321 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05b7eb74a2a9e89392b59c573bd615ba5aa64e54241ccc1de76308ac11568ffd\": container with ID starting with 05b7eb74a2a9e89392b59c573bd615ba5aa64e54241ccc1de76308ac11568ffd not found: ID does not exist" containerID="05b7eb74a2a9e89392b59c573bd615ba5aa64e54241ccc1de76308ac11568ffd" Apr 16 15:08:38.288412 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:08:38.288352 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05b7eb74a2a9e89392b59c573bd615ba5aa64e54241ccc1de76308ac11568ffd"} err="failed to get container status \"05b7eb74a2a9e89392b59c573bd615ba5aa64e54241ccc1de76308ac11568ffd\": rpc error: code = NotFound desc = could not find container \"05b7eb74a2a9e89392b59c573bd615ba5aa64e54241ccc1de76308ac11568ffd\": container with ID starting with 05b7eb74a2a9e89392b59c573bd615ba5aa64e54241ccc1de76308ac11568ffd not found: ID does not exist" Apr 16 15:08:38.322202 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:08:38.322133 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bbb7ee16-5fd6-41a4-8ad1-b874dd5f048a-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "bbb7ee16-5fd6-41a4-8ad1-b874dd5f048a" (UID: "bbb7ee16-5fd6-41a4-8ad1-b874dd5f048a"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:08:38.358581 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:08:38.358548 2579 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/bbb7ee16-5fd6-41a4-8ad1-b874dd5f048a-tls-certs\") on node \"ip-10-0-139-47.ec2.internal\" DevicePath \"\"" Apr 16 15:08:38.358581 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:08:38.358581 2579 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/bbb7ee16-5fd6-41a4-8ad1-b874dd5f048a-dshm\") on node \"ip-10-0-139-47.ec2.internal\" DevicePath \"\"" Apr 16 15:08:38.358708 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:08:38.358596 2579 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bbb7ee16-5fd6-41a4-8ad1-b874dd5f048a-kserve-provision-location\") on node \"ip-10-0-139-47.ec2.internal\" DevicePath \"\"" Apr 16 15:08:38.358708 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:08:38.358605 2579 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/bbb7ee16-5fd6-41a4-8ad1-b874dd5f048a-home\") on node \"ip-10-0-139-47.ec2.internal\" DevicePath \"\"" Apr 16 15:08:38.358708 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:08:38.358614 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-q9qq6\" (UniqueName: \"kubernetes.io/projected/bbb7ee16-5fd6-41a4-8ad1-b874dd5f048a-kube-api-access-q9qq6\") on node \"ip-10-0-139-47.ec2.internal\" DevicePath \"\"" Apr 16 15:08:38.358708 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:08:38.358623 2579 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/bbb7ee16-5fd6-41a4-8ad1-b874dd5f048a-model-cache\") on node \"ip-10-0-139-47.ec2.internal\" DevicePath \"\"" Apr 16 15:08:38.532827 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:08:38.532790 2579 kubelet.go:2553] 
"SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-56f5cfcbc8dx46j"] Apr 16 15:08:38.536674 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:08:38.536640 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-56f5cfcbc8dx46j"] Apr 16 15:08:39.354941 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:08:39.354891 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bbb7ee16-5fd6-41a4-8ad1-b874dd5f048a" path="/var/lib/kubelet/pods/bbb7ee16-5fd6-41a4-8ad1-b874dd5f048a/volumes" Apr 16 15:08:54.547265 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:08:54.547172 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75c8dbd695nhfsd"] Apr 16 15:08:54.547630 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:08:54.547498 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bbb7ee16-5fd6-41a4-8ad1-b874dd5f048a" containerName="main" Apr 16 15:08:54.547630 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:08:54.547511 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbb7ee16-5fd6-41a4-8ad1-b874dd5f048a" containerName="main" Apr 16 15:08:54.547630 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:08:54.547530 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bbb7ee16-5fd6-41a4-8ad1-b874dd5f048a" containerName="storage-initializer" Apr 16 15:08:54.547630 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:08:54.547535 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbb7ee16-5fd6-41a4-8ad1-b874dd5f048a" containerName="storage-initializer" Apr 16 15:08:54.547630 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:08:54.547581 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="bbb7ee16-5fd6-41a4-8ad1-b874dd5f048a" containerName="main" Apr 16 15:08:54.550474 ip-10-0-139-47 kubenswrapper[2579]: I0416 
15:08:54.550456 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75c8dbd695nhfsd" Apr 16 15:08:54.552976 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:08:54.552954 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvdde380eaa9fe1facad32d45131f9e34d-kserve-self-signed-certs\"" Apr 16 15:08:54.553072 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:08:54.552974 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-dmzth\"" Apr 16 15:08:54.560252 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:08:54.560226 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75c8dbd695nhfsd"] Apr 16 15:08:54.587962 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:08:54.587924 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9g46\" (UniqueName: \"kubernetes.io/projected/160a4a04-d5e6-42fb-9939-141fbc367e66-kube-api-access-b9g46\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75c8dbd695nhfsd\" (UID: \"160a4a04-d5e6-42fb-9939-141fbc367e66\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75c8dbd695nhfsd" Apr 16 15:08:54.588128 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:08:54.587998 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/160a4a04-d5e6-42fb-9939-141fbc367e66-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75c8dbd695nhfsd\" (UID: \"160a4a04-d5e6-42fb-9939-141fbc367e66\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75c8dbd695nhfsd" Apr 16 15:08:54.588128 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:08:54.588018 2579 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/160a4a04-d5e6-42fb-9939-141fbc367e66-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75c8dbd695nhfsd\" (UID: \"160a4a04-d5e6-42fb-9939-141fbc367e66\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75c8dbd695nhfsd" Apr 16 15:08:54.588128 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:08:54.588035 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/160a4a04-d5e6-42fb-9939-141fbc367e66-dshm\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75c8dbd695nhfsd\" (UID: \"160a4a04-d5e6-42fb-9939-141fbc367e66\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75c8dbd695nhfsd" Apr 16 15:08:54.588128 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:08:54.588065 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/160a4a04-d5e6-42fb-9939-141fbc367e66-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75c8dbd695nhfsd\" (UID: \"160a4a04-d5e6-42fb-9939-141fbc367e66\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75c8dbd695nhfsd" Apr 16 15:08:54.588257 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:08:54.588152 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/160a4a04-d5e6-42fb-9939-141fbc367e66-home\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75c8dbd695nhfsd\" (UID: \"160a4a04-d5e6-42fb-9939-141fbc367e66\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75c8dbd695nhfsd" Apr 16 15:08:54.689350 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:08:54.689313 2579 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/160a4a04-d5e6-42fb-9939-141fbc367e66-home\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75c8dbd695nhfsd\" (UID: \"160a4a04-d5e6-42fb-9939-141fbc367e66\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75c8dbd695nhfsd" Apr 16 15:08:54.689533 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:08:54.689366 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b9g46\" (UniqueName: \"kubernetes.io/projected/160a4a04-d5e6-42fb-9939-141fbc367e66-kube-api-access-b9g46\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75c8dbd695nhfsd\" (UID: \"160a4a04-d5e6-42fb-9939-141fbc367e66\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75c8dbd695nhfsd" Apr 16 15:08:54.689533 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:08:54.689422 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/160a4a04-d5e6-42fb-9939-141fbc367e66-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75c8dbd695nhfsd\" (UID: \"160a4a04-d5e6-42fb-9939-141fbc367e66\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75c8dbd695nhfsd" Apr 16 15:08:54.689533 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:08:54.689452 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/160a4a04-d5e6-42fb-9939-141fbc367e66-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75c8dbd695nhfsd\" (UID: \"160a4a04-d5e6-42fb-9939-141fbc367e66\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75c8dbd695nhfsd" Apr 16 15:08:54.689533 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:08:54.689477 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"dshm\" (UniqueName: \"kubernetes.io/empty-dir/160a4a04-d5e6-42fb-9939-141fbc367e66-dshm\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75c8dbd695nhfsd\" (UID: \"160a4a04-d5e6-42fb-9939-141fbc367e66\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75c8dbd695nhfsd" Apr 16 15:08:54.689533 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:08:54.689515 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/160a4a04-d5e6-42fb-9939-141fbc367e66-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75c8dbd695nhfsd\" (UID: \"160a4a04-d5e6-42fb-9939-141fbc367e66\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75c8dbd695nhfsd" Apr 16 15:08:54.689810 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:08:54.689787 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/160a4a04-d5e6-42fb-9939-141fbc367e66-home\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75c8dbd695nhfsd\" (UID: \"160a4a04-d5e6-42fb-9939-141fbc367e66\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75c8dbd695nhfsd" Apr 16 15:08:54.689869 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:08:54.689831 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/160a4a04-d5e6-42fb-9939-141fbc367e66-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75c8dbd695nhfsd\" (UID: \"160a4a04-d5e6-42fb-9939-141fbc367e66\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75c8dbd695nhfsd" Apr 16 15:08:54.689869 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:08:54.689851 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/160a4a04-d5e6-42fb-9939-141fbc367e66-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75c8dbd695nhfsd\" (UID: \"160a4a04-d5e6-42fb-9939-141fbc367e66\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75c8dbd695nhfsd" Apr 16 15:08:54.691695 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:08:54.691671 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/160a4a04-d5e6-42fb-9939-141fbc367e66-dshm\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75c8dbd695nhfsd\" (UID: \"160a4a04-d5e6-42fb-9939-141fbc367e66\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75c8dbd695nhfsd" Apr 16 15:08:54.691889 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:08:54.691873 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/160a4a04-d5e6-42fb-9939-141fbc367e66-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75c8dbd695nhfsd\" (UID: \"160a4a04-d5e6-42fb-9939-141fbc367e66\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75c8dbd695nhfsd" Apr 16 15:08:54.699975 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:08:54.699950 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9g46\" (UniqueName: \"kubernetes.io/projected/160a4a04-d5e6-42fb-9939-141fbc367e66-kube-api-access-b9g46\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75c8dbd695nhfsd\" (UID: \"160a4a04-d5e6-42fb-9939-141fbc367e66\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75c8dbd695nhfsd" Apr 16 15:08:54.861835 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:08:54.861747 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75c8dbd695nhfsd" Apr 16 15:08:54.985060 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:08:54.985026 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75c8dbd695nhfsd"] Apr 16 15:08:54.988795 ip-10-0-139-47 kubenswrapper[2579]: W0416 15:08:54.988769 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod160a4a04_d5e6_42fb_9939_141fbc367e66.slice/crio-ad0f28e5c4a188b73345ef16a64d8bc7a56aef232287a41a9da3f1851ae7dd67 WatchSource:0}: Error finding container ad0f28e5c4a188b73345ef16a64d8bc7a56aef232287a41a9da3f1851ae7dd67: Status 404 returned error can't find the container with id ad0f28e5c4a188b73345ef16a64d8bc7a56aef232287a41a9da3f1851ae7dd67 Apr 16 15:08:55.264665 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:08:55.264624 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75c8dbd695nhfsd" event={"ID":"160a4a04-d5e6-42fb-9939-141fbc367e66","Type":"ContainerStarted","Data":"7566ee891ae890e44e6a50b0f316c2778768a3b27b5e5c3f5fb06747ac0934e9"} Apr 16 15:08:55.264864 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:08:55.264673 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75c8dbd695nhfsd" event={"ID":"160a4a04-d5e6-42fb-9939-141fbc367e66","Type":"ContainerStarted","Data":"ad0f28e5c4a188b73345ef16a64d8bc7a56aef232287a41a9da3f1851ae7dd67"} Apr 16 15:08:59.280536 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:08:59.280498 2579 generic.go:358] "Generic (PLEG): container finished" podID="160a4a04-d5e6-42fb-9939-141fbc367e66" containerID="7566ee891ae890e44e6a50b0f316c2778768a3b27b5e5c3f5fb06747ac0934e9" exitCode=0 Apr 16 15:08:59.281037 ip-10-0-139-47 kubenswrapper[2579]: I0416 
15:08:59.280550 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75c8dbd695nhfsd" event={"ID":"160a4a04-d5e6-42fb-9939-141fbc367e66","Type":"ContainerDied","Data":"7566ee891ae890e44e6a50b0f316c2778768a3b27b5e5c3f5fb06747ac0934e9"} Apr 16 15:09:25.491497 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:09:25.491424 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7nxt62"] Apr 16 15:09:25.582623 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:09:25.582585 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7nxt62"] Apr 16 15:09:25.582805 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:09:25.582738 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7nxt62" Apr 16 15:09:25.585418 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:09:25.585398 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-kserve-self-signed-certs\"" Apr 16 15:09:25.585545 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:09:25.585496 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-epp-sa-dockercfg-9h4wn\"" Apr 16 15:09:25.765490 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:09:25.765448 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/475e84b0-2f08-4fc2-a54f-ff9d470340e2-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7nxt62\" (UID: \"475e84b0-2f08-4fc2-a54f-ff9d470340e2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7nxt62" Apr 16 
15:09:25.765673 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:09:25.765496 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/475e84b0-2f08-4fc2-a54f-ff9d470340e2-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7nxt62\" (UID: \"475e84b0-2f08-4fc2-a54f-ff9d470340e2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7nxt62" Apr 16 15:09:25.765673 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:09:25.765566 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/475e84b0-2f08-4fc2-a54f-ff9d470340e2-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7nxt62\" (UID: \"475e84b0-2f08-4fc2-a54f-ff9d470340e2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7nxt62" Apr 16 15:09:25.765673 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:09:25.765652 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qj62t\" (UniqueName: \"kubernetes.io/projected/475e84b0-2f08-4fc2-a54f-ff9d470340e2-kube-api-access-qj62t\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7nxt62\" (UID: \"475e84b0-2f08-4fc2-a54f-ff9d470340e2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7nxt62" Apr 16 15:09:25.765798 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:09:25.765691 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/475e84b0-2f08-4fc2-a54f-ff9d470340e2-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7nxt62\" (UID: \"475e84b0-2f08-4fc2-a54f-ff9d470340e2\") " 
pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7nxt62" Apr 16 15:09:25.765798 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:09:25.765723 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/475e84b0-2f08-4fc2-a54f-ff9d470340e2-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7nxt62\" (UID: \"475e84b0-2f08-4fc2-a54f-ff9d470340e2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7nxt62" Apr 16 15:09:25.866563 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:09:25.866526 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/475e84b0-2f08-4fc2-a54f-ff9d470340e2-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7nxt62\" (UID: \"475e84b0-2f08-4fc2-a54f-ff9d470340e2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7nxt62" Apr 16 15:09:25.866563 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:09:25.866568 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/475e84b0-2f08-4fc2-a54f-ff9d470340e2-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7nxt62\" (UID: \"475e84b0-2f08-4fc2-a54f-ff9d470340e2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7nxt62" Apr 16 15:09:25.866790 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:09:25.866588 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/475e84b0-2f08-4fc2-a54f-ff9d470340e2-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7nxt62\" (UID: \"475e84b0-2f08-4fc2-a54f-ff9d470340e2\") " 
pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7nxt62" Apr 16 15:09:25.866790 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:09:25.866617 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/475e84b0-2f08-4fc2-a54f-ff9d470340e2-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7nxt62\" (UID: \"475e84b0-2f08-4fc2-a54f-ff9d470340e2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7nxt62" Apr 16 15:09:25.866790 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:09:25.866662 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qj62t\" (UniqueName: \"kubernetes.io/projected/475e84b0-2f08-4fc2-a54f-ff9d470340e2-kube-api-access-qj62t\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7nxt62\" (UID: \"475e84b0-2f08-4fc2-a54f-ff9d470340e2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7nxt62" Apr 16 15:09:25.866790 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:09:25.866698 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/475e84b0-2f08-4fc2-a54f-ff9d470340e2-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7nxt62\" (UID: \"475e84b0-2f08-4fc2-a54f-ff9d470340e2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7nxt62" Apr 16 15:09:25.867299 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:09:25.866974 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/475e84b0-2f08-4fc2-a54f-ff9d470340e2-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7nxt62\" (UID: \"475e84b0-2f08-4fc2-a54f-ff9d470340e2\") " 
pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7nxt62" Apr 16 15:09:25.867299 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:09:25.866983 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/475e84b0-2f08-4fc2-a54f-ff9d470340e2-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7nxt62\" (UID: \"475e84b0-2f08-4fc2-a54f-ff9d470340e2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7nxt62" Apr 16 15:09:25.867299 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:09:25.867017 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/475e84b0-2f08-4fc2-a54f-ff9d470340e2-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7nxt62\" (UID: \"475e84b0-2f08-4fc2-a54f-ff9d470340e2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7nxt62" Apr 16 15:09:25.867299 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:09:25.867103 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/475e84b0-2f08-4fc2-a54f-ff9d470340e2-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7nxt62\" (UID: \"475e84b0-2f08-4fc2-a54f-ff9d470340e2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7nxt62" Apr 16 15:09:25.869124 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:09:25.869101 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/475e84b0-2f08-4fc2-a54f-ff9d470340e2-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7nxt62\" (UID: \"475e84b0-2f08-4fc2-a54f-ff9d470340e2\") " 
pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7nxt62" Apr 16 15:09:25.874369 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:09:25.874339 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qj62t\" (UniqueName: \"kubernetes.io/projected/475e84b0-2f08-4fc2-a54f-ff9d470340e2-kube-api-access-qj62t\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7nxt62\" (UID: \"475e84b0-2f08-4fc2-a54f-ff9d470340e2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7nxt62" Apr 16 15:09:25.892561 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:09:25.892470 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7nxt62" Apr 16 15:09:26.068217 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:09:26.068191 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7nxt62"] Apr 16 15:09:26.070601 ip-10-0-139-47 kubenswrapper[2579]: W0416 15:09:26.070573 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod475e84b0_2f08_4fc2_a54f_ff9d470340e2.slice/crio-a4842cc0eec6dc66ec7bf72082ef7b399059ca518a146d250765403e0bcc0a28 WatchSource:0}: Error finding container a4842cc0eec6dc66ec7bf72082ef7b399059ca518a146d250765403e0bcc0a28: Status 404 returned error can't find the container with id a4842cc0eec6dc66ec7bf72082ef7b399059ca518a146d250765403e0bcc0a28 Apr 16 15:09:26.391327 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:09:26.391232 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75c8dbd695nhfsd" event={"ID":"160a4a04-d5e6-42fb-9939-141fbc367e66","Type":"ContainerStarted","Data":"f34125821ac7808d2cafba05aabc48e1250fec0c85eb12219e861f311a4d37f0"} Apr 16 
15:09:26.393744 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:09:26.393703 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7nxt62" event={"ID":"475e84b0-2f08-4fc2-a54f-ff9d470340e2","Type":"ContainerStarted","Data":"d2556d967b343017a149e33496f1370bcb45bb1172c1db42f38e569a3e2d1a01"} Apr 16 15:09:26.393744 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:09:26.393748 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7nxt62" event={"ID":"475e84b0-2f08-4fc2-a54f-ff9d470340e2","Type":"ContainerStarted","Data":"a4842cc0eec6dc66ec7bf72082ef7b399059ca518a146d250765403e0bcc0a28"} Apr 16 15:09:26.413799 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:09:26.413742 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75c8dbd695nhfsd" podStartSLOduration=5.659799994 podStartE2EDuration="32.413724036s" podCreationTimestamp="2026-04-16 15:08:54 +0000 UTC" firstStartedPulling="2026-04-16 15:08:59.281767282 +0000 UTC m=+988.464305034" lastFinishedPulling="2026-04-16 15:09:26.035691319 +0000 UTC m=+1015.218229076" observedRunningTime="2026-04-16 15:09:26.412872884 +0000 UTC m=+1015.595410659" watchObservedRunningTime="2026-04-16 15:09:26.413724036 +0000 UTC m=+1015.596261813" Apr 16 15:09:27.399518 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:09:27.399485 2579 generic.go:358] "Generic (PLEG): container finished" podID="475e84b0-2f08-4fc2-a54f-ff9d470340e2" containerID="d2556d967b343017a149e33496f1370bcb45bb1172c1db42f38e569a3e2d1a01" exitCode=0 Apr 16 15:09:27.399997 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:09:27.399572 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7nxt62" 
event={"ID":"475e84b0-2f08-4fc2-a54f-ff9d470340e2","Type":"ContainerDied","Data":"d2556d967b343017a149e33496f1370bcb45bb1172c1db42f38e569a3e2d1a01"} Apr 16 15:09:29.407204 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:09:29.407168 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7nxt62" event={"ID":"475e84b0-2f08-4fc2-a54f-ff9d470340e2","Type":"ContainerStarted","Data":"5d39f449ffa2306d09a66fd0fc4a03f8b09db6ff641eded980cdc4cc4157222f"} Apr 16 15:09:34.862540 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:09:34.862489 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75c8dbd695nhfsd" Apr 16 15:09:34.862540 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:09:34.862550 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75c8dbd695nhfsd" Apr 16 15:09:34.863927 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:09:34.863866 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75c8dbd695nhfsd" podUID="160a4a04-d5e6-42fb-9939-141fbc367e66" containerName="main" probeResult="failure" output="Get \"https://10.132.0.25:8000/health\": dial tcp 10.132.0.25:8000: connect: connection refused" Apr 16 15:09:44.862756 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:09:44.862703 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75c8dbd695nhfsd" podUID="160a4a04-d5e6-42fb-9939-141fbc367e66" containerName="main" probeResult="failure" output="Get \"https://10.132.0.25:8000/health\": dial tcp 10.132.0.25:8000: connect: connection refused" Apr 16 15:09:54.863049 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:09:54.862996 2579 prober.go:120] "Probe failed" probeType="Startup" 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75c8dbd695nhfsd" podUID="160a4a04-d5e6-42fb-9939-141fbc367e66" containerName="main" probeResult="failure" output="Get \"https://10.132.0.25:8000/health\": dial tcp 10.132.0.25:8000: connect: connection refused" Apr 16 15:09:58.723898 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:09:58.723834 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7nxt62"] Apr 16 15:10:02.528150 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:10:02.528115 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7nxt62" event={"ID":"475e84b0-2f08-4fc2-a54f-ff9d470340e2","Type":"ContainerStarted","Data":"d32733dee6defa0a57147127c5f161daef10337af0d1da89d6b0047be1922d02"} Apr 16 15:10:02.528555 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:10:02.528264 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7nxt62" podUID="475e84b0-2f08-4fc2-a54f-ff9d470340e2" containerName="main" containerID="cri-o://5d39f449ffa2306d09a66fd0fc4a03f8b09db6ff641eded980cdc4cc4157222f" gracePeriod=30 Apr 16 15:10:02.528555 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:10:02.528291 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7nxt62" podUID="475e84b0-2f08-4fc2-a54f-ff9d470340e2" containerName="tokenizer" containerID="cri-o://d32733dee6defa0a57147127c5f161daef10337af0d1da89d6b0047be1922d02" gracePeriod=30 Apr 16 15:10:02.528555 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:10:02.528359 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7nxt62" Apr 16 15:10:02.531609 
ip-10-0-139-47 kubenswrapper[2579]: I0416 15:10:02.531570 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7nxt62" podUID="475e84b0-2f08-4fc2-a54f-ff9d470340e2" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 16 15:10:02.549672 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:10:02.549625 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7nxt62" podStartSLOduration=2.489683733 podStartE2EDuration="37.549607396s" podCreationTimestamp="2026-04-16 15:09:25 +0000 UTC" firstStartedPulling="2026-04-16 15:09:27.40071907 +0000 UTC m=+1016.583256821" lastFinishedPulling="2026-04-16 15:10:02.460642729 +0000 UTC m=+1051.643180484" observedRunningTime="2026-04-16 15:10:02.547788145 +0000 UTC m=+1051.730325918" watchObservedRunningTime="2026-04-16 15:10:02.549607396 +0000 UTC m=+1051.732145171" Apr 16 15:10:03.534778 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:10:03.534736 2579 generic.go:358] "Generic (PLEG): container finished" podID="475e84b0-2f08-4fc2-a54f-ff9d470340e2" containerID="5d39f449ffa2306d09a66fd0fc4a03f8b09db6ff641eded980cdc4cc4157222f" exitCode=0 Apr 16 15:10:03.535161 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:10:03.534812 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7nxt62" event={"ID":"475e84b0-2f08-4fc2-a54f-ff9d470340e2","Type":"ContainerDied","Data":"5d39f449ffa2306d09a66fd0fc4a03f8b09db6ff641eded980cdc4cc4157222f"} Apr 16 15:10:04.863192 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:10:04.863152 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75c8dbd695nhfsd" podUID="160a4a04-d5e6-42fb-9939-141fbc367e66" containerName="main" 
probeResult="failure" output="Get \"https://10.132.0.25:8000/health\": dial tcp 10.132.0.25:8000: connect: connection refused" Apr 16 15:10:05.892577 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:10:05.892540 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7nxt62" Apr 16 15:10:12.269408 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:10:12.269375 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-57947565d-qqmr8"] Apr 16 15:10:12.559049 ip-10-0-139-47 kubenswrapper[2579]: W0416 15:10:12.558956 2579 logging.go:55] [core] [Channel #20 SubChannel #21]grpc: addrConn.createTransport failed to connect to {Addr: "10.132.0.26:9003", ServerName: "10.132.0.26:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.132.0.26:9003: connect: connection refused" Apr 16 15:10:12.651821 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:10:12.651780 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-57947565d-qqmr8"] Apr 16 15:10:12.652010 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:10:12.651991 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-57947565d-qqmr8" Apr 16 15:10:12.654798 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:10:12.654775 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"precise-prefix-cache-test-kserve-self-signed-certs\"" Apr 16 15:10:12.690989 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:10:12.690955 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/0a22ebb5-7082-4370-890c-c83386a52422-home\") pod \"precise-prefix-cache-test-kserve-57947565d-qqmr8\" (UID: \"0a22ebb5-7082-4370-890c-c83386a52422\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-57947565d-qqmr8" Apr 16 15:10:12.691166 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:10:12.690997 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/0a22ebb5-7082-4370-890c-c83386a52422-dshm\") pod \"precise-prefix-cache-test-kserve-57947565d-qqmr8\" (UID: \"0a22ebb5-7082-4370-890c-c83386a52422\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-57947565d-qqmr8" Apr 16 15:10:12.691166 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:10:12.691038 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/0a22ebb5-7082-4370-890c-c83386a52422-model-cache\") pod \"precise-prefix-cache-test-kserve-57947565d-qqmr8\" (UID: \"0a22ebb5-7082-4370-890c-c83386a52422\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-57947565d-qqmr8" Apr 16 15:10:12.691166 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:10:12.691072 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b75qf\" (UniqueName: 
\"kubernetes.io/projected/0a22ebb5-7082-4370-890c-c83386a52422-kube-api-access-b75qf\") pod \"precise-prefix-cache-test-kserve-57947565d-qqmr8\" (UID: \"0a22ebb5-7082-4370-890c-c83386a52422\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-57947565d-qqmr8" Apr 16 15:10:12.691166 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:10:12.691129 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0a22ebb5-7082-4370-890c-c83386a52422-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-57947565d-qqmr8\" (UID: \"0a22ebb5-7082-4370-890c-c83386a52422\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-57947565d-qqmr8" Apr 16 15:10:12.691363 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:10:12.691199 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0a22ebb5-7082-4370-890c-c83386a52422-tls-certs\") pod \"precise-prefix-cache-test-kserve-57947565d-qqmr8\" (UID: \"0a22ebb5-7082-4370-890c-c83386a52422\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-57947565d-qqmr8" Apr 16 15:10:12.792040 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:10:12.791998 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0a22ebb5-7082-4370-890c-c83386a52422-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-57947565d-qqmr8\" (UID: \"0a22ebb5-7082-4370-890c-c83386a52422\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-57947565d-qqmr8" Apr 16 15:10:12.792247 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:10:12.792092 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0a22ebb5-7082-4370-890c-c83386a52422-tls-certs\") pod 
\"precise-prefix-cache-test-kserve-57947565d-qqmr8\" (UID: \"0a22ebb5-7082-4370-890c-c83386a52422\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-57947565d-qqmr8" Apr 16 15:10:12.792247 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:10:12.792157 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/0a22ebb5-7082-4370-890c-c83386a52422-home\") pod \"precise-prefix-cache-test-kserve-57947565d-qqmr8\" (UID: \"0a22ebb5-7082-4370-890c-c83386a52422\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-57947565d-qqmr8" Apr 16 15:10:12.792247 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:10:12.792189 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/0a22ebb5-7082-4370-890c-c83386a52422-dshm\") pod \"precise-prefix-cache-test-kserve-57947565d-qqmr8\" (UID: \"0a22ebb5-7082-4370-890c-c83386a52422\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-57947565d-qqmr8" Apr 16 15:10:12.792247 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:10:12.792219 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/0a22ebb5-7082-4370-890c-c83386a52422-model-cache\") pod \"precise-prefix-cache-test-kserve-57947565d-qqmr8\" (UID: \"0a22ebb5-7082-4370-890c-c83386a52422\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-57947565d-qqmr8" Apr 16 15:10:12.792247 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:10:12.792247 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b75qf\" (UniqueName: \"kubernetes.io/projected/0a22ebb5-7082-4370-890c-c83386a52422-kube-api-access-b75qf\") pod \"precise-prefix-cache-test-kserve-57947565d-qqmr8\" (UID: \"0a22ebb5-7082-4370-890c-c83386a52422\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-57947565d-qqmr8" Apr 16 15:10:12.792520 
ip-10-0-139-47 kubenswrapper[2579]: I0416 15:10:12.792469 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0a22ebb5-7082-4370-890c-c83386a52422-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-57947565d-qqmr8\" (UID: \"0a22ebb5-7082-4370-890c-c83386a52422\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-57947565d-qqmr8" Apr 16 15:10:12.792520 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:10:12.792502 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/0a22ebb5-7082-4370-890c-c83386a52422-home\") pod \"precise-prefix-cache-test-kserve-57947565d-qqmr8\" (UID: \"0a22ebb5-7082-4370-890c-c83386a52422\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-57947565d-qqmr8" Apr 16 15:10:12.792743 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:10:12.792724 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/0a22ebb5-7082-4370-890c-c83386a52422-model-cache\") pod \"precise-prefix-cache-test-kserve-57947565d-qqmr8\" (UID: \"0a22ebb5-7082-4370-890c-c83386a52422\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-57947565d-qqmr8" Apr 16 15:10:12.794579 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:10:12.794552 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/0a22ebb5-7082-4370-890c-c83386a52422-dshm\") pod \"precise-prefix-cache-test-kserve-57947565d-qqmr8\" (UID: \"0a22ebb5-7082-4370-890c-c83386a52422\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-57947565d-qqmr8" Apr 16 15:10:12.795118 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:10:12.795086 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0a22ebb5-7082-4370-890c-c83386a52422-tls-certs\") 
pod \"precise-prefix-cache-test-kserve-57947565d-qqmr8\" (UID: \"0a22ebb5-7082-4370-890c-c83386a52422\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-57947565d-qqmr8" Apr 16 15:10:12.799841 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:10:12.799816 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b75qf\" (UniqueName: \"kubernetes.io/projected/0a22ebb5-7082-4370-890c-c83386a52422-kube-api-access-b75qf\") pod \"precise-prefix-cache-test-kserve-57947565d-qqmr8\" (UID: \"0a22ebb5-7082-4370-890c-c83386a52422\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-57947565d-qqmr8" Apr 16 15:10:12.961837 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:10:12.961754 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-57947565d-qqmr8" Apr 16 15:10:13.092693 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:10:13.092666 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-57947565d-qqmr8"] Apr 16 15:10:13.095973 ip-10-0-139-47 kubenswrapper[2579]: W0416 15:10:13.095940 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a22ebb5_7082_4370_890c_c83386a52422.slice/crio-c36ad7ed95d6611066da39b2a6d26441bb7fc667b184322ce1c6f11f6944005e WatchSource:0}: Error finding container c36ad7ed95d6611066da39b2a6d26441bb7fc667b184322ce1c6f11f6944005e: Status 404 returned error can't find the container with id c36ad7ed95d6611066da39b2a6d26441bb7fc667b184322ce1c6f11f6944005e Apr 16 15:10:13.530104 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:10:13.530045 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7nxt62" podUID="475e84b0-2f08-4fc2-a54f-ff9d470340e2" containerName="main" probeResult="failure" output="timeout: failed to connect 
service \"10.132.0.26:9003\" within 1s: context deadline exceeded" Apr 16 15:10:13.569335 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:10:13.569292 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-57947565d-qqmr8" event={"ID":"0a22ebb5-7082-4370-890c-c83386a52422","Type":"ContainerStarted","Data":"ef52bef364a3fe2d31a4c1de6572ebbd7c88a8a0fdcfe49de762c0ccc34f807e"} Apr 16 15:10:13.569335 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:10:13.569338 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-57947565d-qqmr8" event={"ID":"0a22ebb5-7082-4370-890c-c83386a52422","Type":"ContainerStarted","Data":"c36ad7ed95d6611066da39b2a6d26441bb7fc667b184322ce1c6f11f6944005e"} Apr 16 15:10:14.862365 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:10:14.862322 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75c8dbd695nhfsd" podUID="160a4a04-d5e6-42fb-9939-141fbc367e66" containerName="main" probeResult="failure" output="Get \"https://10.132.0.25:8000/health\": dial tcp 10.132.0.25:8000: connect: connection refused" Apr 16 15:10:17.585629 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:10:17.585597 2579 generic.go:358] "Generic (PLEG): container finished" podID="0a22ebb5-7082-4370-890c-c83386a52422" containerID="ef52bef364a3fe2d31a4c1de6572ebbd7c88a8a0fdcfe49de762c0ccc34f807e" exitCode=0 Apr 16 15:10:17.586019 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:10:17.585670 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-57947565d-qqmr8" event={"ID":"0a22ebb5-7082-4370-890c-c83386a52422","Type":"ContainerDied","Data":"ef52bef364a3fe2d31a4c1de6572ebbd7c88a8a0fdcfe49de762c0ccc34f807e"} Apr 16 15:10:18.590959 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:10:18.590924 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-57947565d-qqmr8" event={"ID":"0a22ebb5-7082-4370-890c-c83386a52422","Type":"ContainerStarted","Data":"155c0d846019ea69b271d64b557ca0a9f53f941a5519f6bab66c7c15e8949b83"} Apr 16 15:10:18.609739 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:10:18.609700 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-57947565d-qqmr8" podStartSLOduration=6.609685463 podStartE2EDuration="6.609685463s" podCreationTimestamp="2026-04-16 15:10:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:10:18.607505187 +0000 UTC m=+1067.790042958" watchObservedRunningTime="2026-04-16 15:10:18.609685463 +0000 UTC m=+1067.792223237" Apr 16 15:10:22.528934 ip-10-0-139-47 kubenswrapper[2579]: W0416 15:10:22.528835 2579 logging.go:55] [core] [Channel #22 SubChannel #23]grpc: addrConn.createTransport failed to connect to {Addr: "10.132.0.26:9003", ServerName: "10.132.0.26:9003", }. 
Err: connection error: desc = "transport: Error while dialing: dial tcp 10.132.0.26:9003: connect: connection refused"
Apr 16 15:10:22.962659 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:10:22.962559 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-57947565d-qqmr8"
Apr 16 15:10:22.962659 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:10:22.962609 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-57947565d-qqmr8"
Apr 16 15:10:22.975027 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:10:22.974994 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-57947565d-qqmr8"
Apr 16 15:10:23.528536 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:10:23.528483 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7nxt62" podUID="475e84b0-2f08-4fc2-a54f-ff9d470340e2" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.132.0.26:9003\" within 1s: context deadline exceeded"
Apr 16 15:10:23.618782 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:10:23.618757 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-57947565d-qqmr8"
Apr 16 15:10:24.863176 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:10:24.863135 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75c8dbd695nhfsd" podUID="160a4a04-d5e6-42fb-9939-141fbc367e66" containerName="main" probeResult="failure" output="Get \"https://10.132.0.25:8000/health\": dial tcp 10.132.0.25:8000: connect: connection refused"
Apr 16 15:10:32.529577 ip-10-0-139-47 kubenswrapper[2579]: W0416 15:10:32.529549 2579 logging.go:55] [core] [Channel #24 SubChannel #25]grpc: addrConn.createTransport failed to connect to {Addr: "10.132.0.26:9003", ServerName: "10.132.0.26:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.132.0.26:9003: connect: connection refused"
Apr 16 15:10:32.643281 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:10:32.643254 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7nxt62_475e84b0-2f08-4fc2-a54f-ff9d470340e2/tokenizer/0.log"
Apr 16 15:10:32.643997 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:10:32.643972 2579 generic.go:358] "Generic (PLEG): container finished" podID="475e84b0-2f08-4fc2-a54f-ff9d470340e2" containerID="d32733dee6defa0a57147127c5f161daef10337af0d1da89d6b0047be1922d02" exitCode=137
Apr 16 15:10:32.644072 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:10:32.644009 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7nxt62" event={"ID":"475e84b0-2f08-4fc2-a54f-ff9d470340e2","Type":"ContainerDied","Data":"d32733dee6defa0a57147127c5f161daef10337af0d1da89d6b0047be1922d02"}
Apr 16 15:10:32.855389 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:10:32.855368 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7nxt62_475e84b0-2f08-4fc2-a54f-ff9d470340e2/tokenizer/0.log"
Apr 16 15:10:32.856104 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:10:32.856085 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7nxt62"
Apr 16 15:10:32.979412 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:10:32.979384 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/475e84b0-2f08-4fc2-a54f-ff9d470340e2-tokenizer-tmp\") pod \"475e84b0-2f08-4fc2-a54f-ff9d470340e2\" (UID: \"475e84b0-2f08-4fc2-a54f-ff9d470340e2\") "
Apr 16 15:10:32.979412 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:10:32.979420 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/475e84b0-2f08-4fc2-a54f-ff9d470340e2-tls-certs\") pod \"475e84b0-2f08-4fc2-a54f-ff9d470340e2\" (UID: \"475e84b0-2f08-4fc2-a54f-ff9d470340e2\") "
Apr 16 15:10:32.979638 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:10:32.979436 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/475e84b0-2f08-4fc2-a54f-ff9d470340e2-tokenizer-uds\") pod \"475e84b0-2f08-4fc2-a54f-ff9d470340e2\" (UID: \"475e84b0-2f08-4fc2-a54f-ff9d470340e2\") "
Apr 16 15:10:32.979638 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:10:32.979461 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/475e84b0-2f08-4fc2-a54f-ff9d470340e2-kserve-provision-location\") pod \"475e84b0-2f08-4fc2-a54f-ff9d470340e2\" (UID: \"475e84b0-2f08-4fc2-a54f-ff9d470340e2\") "
Apr 16 15:10:32.979638 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:10:32.979520 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qj62t\" (UniqueName: \"kubernetes.io/projected/475e84b0-2f08-4fc2-a54f-ff9d470340e2-kube-api-access-qj62t\") pod \"475e84b0-2f08-4fc2-a54f-ff9d470340e2\" (UID: \"475e84b0-2f08-4fc2-a54f-ff9d470340e2\") "
Apr 16 15:10:32.979638 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:10:32.979560 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/475e84b0-2f08-4fc2-a54f-ff9d470340e2-tokenizer-cache\") pod \"475e84b0-2f08-4fc2-a54f-ff9d470340e2\" (UID: \"475e84b0-2f08-4fc2-a54f-ff9d470340e2\") "
Apr 16 15:10:32.979884 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:10:32.979837 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/475e84b0-2f08-4fc2-a54f-ff9d470340e2-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "475e84b0-2f08-4fc2-a54f-ff9d470340e2" (UID: "475e84b0-2f08-4fc2-a54f-ff9d470340e2"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 15:10:32.979977 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:10:32.979923 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/475e84b0-2f08-4fc2-a54f-ff9d470340e2-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "475e84b0-2f08-4fc2-a54f-ff9d470340e2" (UID: "475e84b0-2f08-4fc2-a54f-ff9d470340e2"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 15:10:32.980033 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:10:32.980010 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/475e84b0-2f08-4fc2-a54f-ff9d470340e2-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "475e84b0-2f08-4fc2-a54f-ff9d470340e2" (UID: "475e84b0-2f08-4fc2-a54f-ff9d470340e2"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 15:10:32.980447 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:10:32.980422 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/475e84b0-2f08-4fc2-a54f-ff9d470340e2-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "475e84b0-2f08-4fc2-a54f-ff9d470340e2" (UID: "475e84b0-2f08-4fc2-a54f-ff9d470340e2"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 15:10:32.981813 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:10:32.981789 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/475e84b0-2f08-4fc2-a54f-ff9d470340e2-kube-api-access-qj62t" (OuterVolumeSpecName: "kube-api-access-qj62t") pod "475e84b0-2f08-4fc2-a54f-ff9d470340e2" (UID: "475e84b0-2f08-4fc2-a54f-ff9d470340e2"). InnerVolumeSpecName "kube-api-access-qj62t". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 15:10:32.982135 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:10:32.982119 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/475e84b0-2f08-4fc2-a54f-ff9d470340e2-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "475e84b0-2f08-4fc2-a54f-ff9d470340e2" (UID: "475e84b0-2f08-4fc2-a54f-ff9d470340e2"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 15:10:33.080593 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:10:33.080560 2579 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/475e84b0-2f08-4fc2-a54f-ff9d470340e2-tokenizer-tmp\") on node \"ip-10-0-139-47.ec2.internal\" DevicePath \"\""
Apr 16 15:10:33.080593 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:10:33.080590 2579 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/475e84b0-2f08-4fc2-a54f-ff9d470340e2-tls-certs\") on node \"ip-10-0-139-47.ec2.internal\" DevicePath \"\""
Apr 16 15:10:33.080593 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:10:33.080601 2579 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/475e84b0-2f08-4fc2-a54f-ff9d470340e2-tokenizer-uds\") on node \"ip-10-0-139-47.ec2.internal\" DevicePath \"\""
Apr 16 15:10:33.080821 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:10:33.080611 2579 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/475e84b0-2f08-4fc2-a54f-ff9d470340e2-kserve-provision-location\") on node \"ip-10-0-139-47.ec2.internal\" DevicePath \"\""
Apr 16 15:10:33.080821 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:10:33.080621 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qj62t\" (UniqueName: \"kubernetes.io/projected/475e84b0-2f08-4fc2-a54f-ff9d470340e2-kube-api-access-qj62t\") on node \"ip-10-0-139-47.ec2.internal\" DevicePath \"\""
Apr 16 15:10:33.080821 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:10:33.080630 2579 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/475e84b0-2f08-4fc2-a54f-ff9d470340e2-tokenizer-cache\") on node \"ip-10-0-139-47.ec2.internal\" DevicePath \"\""
Apr 16 15:10:33.529933 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:10:33.529853 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7nxt62" podUID="475e84b0-2f08-4fc2-a54f-ff9d470340e2" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.132.0.26:9003\" within 1s: context deadline exceeded"
Apr 16 15:10:33.649500 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:10:33.649411 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7nxt62_475e84b0-2f08-4fc2-a54f-ff9d470340e2/tokenizer/0.log"
Apr 16 15:10:33.650185 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:10:33.650152 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7nxt62" event={"ID":"475e84b0-2f08-4fc2-a54f-ff9d470340e2","Type":"ContainerDied","Data":"a4842cc0eec6dc66ec7bf72082ef7b399059ca518a146d250765403e0bcc0a28"}
Apr 16 15:10:33.650323 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:10:33.650195 2579 scope.go:117] "RemoveContainer" containerID="d32733dee6defa0a57147127c5f161daef10337af0d1da89d6b0047be1922d02"
Apr 16 15:10:33.650323 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:10:33.650203 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7nxt62"
Apr 16 15:10:33.658657 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:10:33.658633 2579 scope.go:117] "RemoveContainer" containerID="5d39f449ffa2306d09a66fd0fc4a03f8b09db6ff641eded980cdc4cc4157222f"
Apr 16 15:10:33.668232 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:10:33.667790 2579 scope.go:117] "RemoveContainer" containerID="d2556d967b343017a149e33496f1370bcb45bb1172c1db42f38e569a3e2d1a01"
Apr 16 15:10:33.668232 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:10:33.668017 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7nxt62"]
Apr 16 15:10:33.672124 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:10:33.672095 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7nxt62"]
Apr 16 15:10:34.863195 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:10:34.863142 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75c8dbd695nhfsd" podUID="160a4a04-d5e6-42fb-9939-141fbc367e66" containerName="main" probeResult="failure" output="Get \"https://10.132.0.25:8000/health\": dial tcp 10.132.0.25:8000: connect: connection refused"
Apr 16 15:10:35.354322 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:10:35.354278 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="475e84b0-2f08-4fc2-a54f-ff9d470340e2" path="/var/lib/kubelet/pods/475e84b0-2f08-4fc2-a54f-ff9d470340e2/volumes"
Apr 16 15:10:44.862595 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:10:44.862539 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75c8dbd695nhfsd" podUID="160a4a04-d5e6-42fb-9939-141fbc367e66" containerName="main" probeResult="failure" output="Get \"https://10.132.0.25:8000/health\": dial tcp 10.132.0.25:8000: connect: connection refused"
Apr 16 15:10:54.862225 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:10:54.862177 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75c8dbd695nhfsd" podUID="160a4a04-d5e6-42fb-9939-141fbc367e66" containerName="main" probeResult="failure" output="Get \"https://10.132.0.25:8000/health\": dial tcp 10.132.0.25:8000: connect: connection refused"
Apr 16 15:10:57.891731 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:10:57.891697 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-57947565d-qqmr8"]
Apr 16 15:10:57.892231 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:10:57.892087 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-57947565d-qqmr8" podUID="0a22ebb5-7082-4370-890c-c83386a52422" containerName="main" containerID="cri-o://155c0d846019ea69b271d64b557ca0a9f53f941a5519f6bab66c7c15e8949b83" gracePeriod=30
Apr 16 15:10:58.156963 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:10:58.156934 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-57947565d-qqmr8"
Apr 16 15:10:58.201040 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:10:58.201000 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0a22ebb5-7082-4370-890c-c83386a52422-kserve-provision-location\") pod \"0a22ebb5-7082-4370-890c-c83386a52422\" (UID: \"0a22ebb5-7082-4370-890c-c83386a52422\") "
Apr 16 15:10:58.201250 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:10:58.201105 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/0a22ebb5-7082-4370-890c-c83386a52422-model-cache\") pod \"0a22ebb5-7082-4370-890c-c83386a52422\" (UID: \"0a22ebb5-7082-4370-890c-c83386a52422\") "
Apr 16 15:10:58.201250 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:10:58.201135 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b75qf\" (UniqueName: \"kubernetes.io/projected/0a22ebb5-7082-4370-890c-c83386a52422-kube-api-access-b75qf\") pod \"0a22ebb5-7082-4370-890c-c83386a52422\" (UID: \"0a22ebb5-7082-4370-890c-c83386a52422\") "
Apr 16 15:10:58.201250 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:10:58.201199 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/0a22ebb5-7082-4370-890c-c83386a52422-home\") pod \"0a22ebb5-7082-4370-890c-c83386a52422\" (UID: \"0a22ebb5-7082-4370-890c-c83386a52422\") "
Apr 16 15:10:58.201250 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:10:58.201239 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/0a22ebb5-7082-4370-890c-c83386a52422-dshm\") pod \"0a22ebb5-7082-4370-890c-c83386a52422\" (UID: \"0a22ebb5-7082-4370-890c-c83386a52422\") "
Apr 16 15:10:58.201455 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:10:58.201269 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0a22ebb5-7082-4370-890c-c83386a52422-tls-certs\") pod \"0a22ebb5-7082-4370-890c-c83386a52422\" (UID: \"0a22ebb5-7082-4370-890c-c83386a52422\") "
Apr 16 15:10:58.201455 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:10:58.201373 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a22ebb5-7082-4370-890c-c83386a52422-model-cache" (OuterVolumeSpecName: "model-cache") pod "0a22ebb5-7082-4370-890c-c83386a52422" (UID: "0a22ebb5-7082-4370-890c-c83386a52422"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 15:10:58.201547 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:10:58.201514 2579 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/0a22ebb5-7082-4370-890c-c83386a52422-model-cache\") on node \"ip-10-0-139-47.ec2.internal\" DevicePath \"\""
Apr 16 15:10:58.201755 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:10:58.201718 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a22ebb5-7082-4370-890c-c83386a52422-home" (OuterVolumeSpecName: "home") pod "0a22ebb5-7082-4370-890c-c83386a52422" (UID: "0a22ebb5-7082-4370-890c-c83386a52422"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 15:10:58.203636 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:10:58.203593 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a22ebb5-7082-4370-890c-c83386a52422-kube-api-access-b75qf" (OuterVolumeSpecName: "kube-api-access-b75qf") pod "0a22ebb5-7082-4370-890c-c83386a52422" (UID: "0a22ebb5-7082-4370-890c-c83386a52422"). InnerVolumeSpecName "kube-api-access-b75qf". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 15:10:58.204015 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:10:58.203990 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a22ebb5-7082-4370-890c-c83386a52422-dshm" (OuterVolumeSpecName: "dshm") pod "0a22ebb5-7082-4370-890c-c83386a52422" (UID: "0a22ebb5-7082-4370-890c-c83386a52422"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 15:10:58.204144 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:10:58.204124 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a22ebb5-7082-4370-890c-c83386a52422-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "0a22ebb5-7082-4370-890c-c83386a52422" (UID: "0a22ebb5-7082-4370-890c-c83386a52422"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 15:10:58.260132 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:10:58.260089 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a22ebb5-7082-4370-890c-c83386a52422-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "0a22ebb5-7082-4370-890c-c83386a52422" (UID: "0a22ebb5-7082-4370-890c-c83386a52422"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 15:10:58.302752 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:10:58.302705 2579 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0a22ebb5-7082-4370-890c-c83386a52422-kserve-provision-location\") on node \"ip-10-0-139-47.ec2.internal\" DevicePath \"\""
Apr 16 15:10:58.302752 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:10:58.302743 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-b75qf\" (UniqueName: \"kubernetes.io/projected/0a22ebb5-7082-4370-890c-c83386a52422-kube-api-access-b75qf\") on node \"ip-10-0-139-47.ec2.internal\" DevicePath \"\""
Apr 16 15:10:58.302752 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:10:58.302758 2579 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/0a22ebb5-7082-4370-890c-c83386a52422-home\") on node \"ip-10-0-139-47.ec2.internal\" DevicePath \"\""
Apr 16 15:10:58.303013 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:10:58.302770 2579 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/0a22ebb5-7082-4370-890c-c83386a52422-dshm\") on node \"ip-10-0-139-47.ec2.internal\" DevicePath \"\""
Apr 16 15:10:58.303013 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:10:58.302782 2579 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0a22ebb5-7082-4370-890c-c83386a52422-tls-certs\") on node \"ip-10-0-139-47.ec2.internal\" DevicePath \"\""
Apr 16 15:10:58.733883 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:10:58.733845 2579 generic.go:358] "Generic (PLEG): container finished" podID="0a22ebb5-7082-4370-890c-c83386a52422" containerID="155c0d846019ea69b271d64b557ca0a9f53f941a5519f6bab66c7c15e8949b83" exitCode=0
Apr 16 15:10:58.734118 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:10:58.733896 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-57947565d-qqmr8" event={"ID":"0a22ebb5-7082-4370-890c-c83386a52422","Type":"ContainerDied","Data":"155c0d846019ea69b271d64b557ca0a9f53f941a5519f6bab66c7c15e8949b83"}
Apr 16 15:10:58.734118 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:10:58.733950 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-57947565d-qqmr8"
Apr 16 15:10:58.734118 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:10:58.733965 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-57947565d-qqmr8" event={"ID":"0a22ebb5-7082-4370-890c-c83386a52422","Type":"ContainerDied","Data":"c36ad7ed95d6611066da39b2a6d26441bb7fc667b184322ce1c6f11f6944005e"}
Apr 16 15:10:58.734118 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:10:58.733988 2579 scope.go:117] "RemoveContainer" containerID="155c0d846019ea69b271d64b557ca0a9f53f941a5519f6bab66c7c15e8949b83"
Apr 16 15:10:58.744483 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:10:58.744456 2579 scope.go:117] "RemoveContainer" containerID="ef52bef364a3fe2d31a4c1de6572ebbd7c88a8a0fdcfe49de762c0ccc34f807e"
Apr 16 15:10:58.757132 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:10:58.757103 2579 scope.go:117] "RemoveContainer" containerID="155c0d846019ea69b271d64b557ca0a9f53f941a5519f6bab66c7c15e8949b83"
Apr 16 15:10:58.757504 ip-10-0-139-47 kubenswrapper[2579]: E0416 15:10:58.757476 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"155c0d846019ea69b271d64b557ca0a9f53f941a5519f6bab66c7c15e8949b83\": container with ID starting with 155c0d846019ea69b271d64b557ca0a9f53f941a5519f6bab66c7c15e8949b83 not found: ID does not exist" containerID="155c0d846019ea69b271d64b557ca0a9f53f941a5519f6bab66c7c15e8949b83"
Apr 16 15:10:58.757580 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:10:58.757521 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"155c0d846019ea69b271d64b557ca0a9f53f941a5519f6bab66c7c15e8949b83"} err="failed to get container status \"155c0d846019ea69b271d64b557ca0a9f53f941a5519f6bab66c7c15e8949b83\": rpc error: code = NotFound desc = could not find container \"155c0d846019ea69b271d64b557ca0a9f53f941a5519f6bab66c7c15e8949b83\": container with ID starting with 155c0d846019ea69b271d64b557ca0a9f53f941a5519f6bab66c7c15e8949b83 not found: ID does not exist"
Apr 16 15:10:58.757580 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:10:58.757551 2579 scope.go:117] "RemoveContainer" containerID="ef52bef364a3fe2d31a4c1de6572ebbd7c88a8a0fdcfe49de762c0ccc34f807e"
Apr 16 15:10:58.758028 ip-10-0-139-47 kubenswrapper[2579]: E0416 15:10:58.757951 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef52bef364a3fe2d31a4c1de6572ebbd7c88a8a0fdcfe49de762c0ccc34f807e\": container with ID starting with ef52bef364a3fe2d31a4c1de6572ebbd7c88a8a0fdcfe49de762c0ccc34f807e not found: ID does not exist" containerID="ef52bef364a3fe2d31a4c1de6572ebbd7c88a8a0fdcfe49de762c0ccc34f807e"
Apr 16 15:10:58.758028 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:10:58.757992 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef52bef364a3fe2d31a4c1de6572ebbd7c88a8a0fdcfe49de762c0ccc34f807e"} err="failed to get container status \"ef52bef364a3fe2d31a4c1de6572ebbd7c88a8a0fdcfe49de762c0ccc34f807e\": rpc error: code = NotFound desc = could not find container \"ef52bef364a3fe2d31a4c1de6572ebbd7c88a8a0fdcfe49de762c0ccc34f807e\": container with ID starting with ef52bef364a3fe2d31a4c1de6572ebbd7c88a8a0fdcfe49de762c0ccc34f807e not found: ID does not exist"
Apr 16 15:10:58.758724 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:10:58.758699 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-57947565d-qqmr8"]
Apr 16 15:10:58.760008 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:10:58.759987 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-57947565d-qqmr8"]
Apr 16 15:10:59.354813 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:10:59.354782 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a22ebb5-7082-4370-890c-c83386a52422" path="/var/lib/kubelet/pods/0a22ebb5-7082-4370-890c-c83386a52422/volumes"
Apr 16 15:11:04.862376 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:11:04.862325 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75c8dbd695nhfsd" podUID="160a4a04-d5e6-42fb-9939-141fbc367e66" containerName="main" probeResult="failure" output="Get \"https://10.132.0.25:8000/health\": dial tcp 10.132.0.25:8000: connect: connection refused"
Apr 16 15:11:09.647930 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:11:09.647872 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-5c65d4dbd9-hllt5"]
Apr 16 15:11:09.648417 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:11:09.648310 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="475e84b0-2f08-4fc2-a54f-ff9d470340e2" containerName="main"
Apr 16 15:11:09.648417 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:11:09.648330 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="475e84b0-2f08-4fc2-a54f-ff9d470340e2" containerName="main"
Apr 16 15:11:09.648417 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:11:09.648343 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="475e84b0-2f08-4fc2-a54f-ff9d470340e2" containerName="storage-initializer"
Apr 16 15:11:09.648417 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:11:09.648351 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="475e84b0-2f08-4fc2-a54f-ff9d470340e2" containerName="storage-initializer"
Apr 16 15:11:09.648417 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:11:09.648377 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0a22ebb5-7082-4370-890c-c83386a52422" containerName="storage-initializer"
Apr 16 15:11:09.648417 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:11:09.648385 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a22ebb5-7082-4370-890c-c83386a52422" containerName="storage-initializer"
Apr 16 15:11:09.648417 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:11:09.648399 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0a22ebb5-7082-4370-890c-c83386a52422" containerName="main"
Apr 16 15:11:09.648417 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:11:09.648407 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a22ebb5-7082-4370-890c-c83386a52422" containerName="main"
Apr 16 15:11:09.648801 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:11:09.648423 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="475e84b0-2f08-4fc2-a54f-ff9d470340e2" containerName="tokenizer"
Apr 16 15:11:09.648801 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:11:09.648431 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="475e84b0-2f08-4fc2-a54f-ff9d470340e2" containerName="tokenizer"
Apr 16 15:11:09.648801 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:11:09.648506 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="475e84b0-2f08-4fc2-a54f-ff9d470340e2" containerName="main"
Apr 16 15:11:09.648801 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:11:09.648519 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="475e84b0-2f08-4fc2-a54f-ff9d470340e2" containerName="tokenizer"
Apr 16 15:11:09.648801 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:11:09.648532 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="0a22ebb5-7082-4370-890c-c83386a52422" containerName="main"
Apr 16 15:11:09.654538 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:11:09.654513 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5c65d4dbd9-hllt5"
Apr 16 15:11:09.657026 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:11:09.657002 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-kserve-self-signed-certs\""
Apr 16 15:11:09.660886 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:11:09.660858 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-5c65d4dbd9-hllt5"]
Apr 16 15:11:09.700101 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:11:09.700065 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/0b2c23bf-3ebe-44f9-ba09-6b430d533106-model-cache\") pod \"stop-feature-test-kserve-5c65d4dbd9-hllt5\" (UID: \"0b2c23bf-3ebe-44f9-ba09-6b430d533106\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5c65d4dbd9-hllt5"
Apr 16 15:11:09.700265 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:11:09.700112 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0b2c23bf-3ebe-44f9-ba09-6b430d533106-kserve-provision-location\") pod \"stop-feature-test-kserve-5c65d4dbd9-hllt5\" (UID: \"0b2c23bf-3ebe-44f9-ba09-6b430d533106\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5c65d4dbd9-hllt5"
Apr 16 15:11:09.700265 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:11:09.700134 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxj6s\" (UniqueName: \"kubernetes.io/projected/0b2c23bf-3ebe-44f9-ba09-6b430d533106-kube-api-access-wxj6s\") pod \"stop-feature-test-kserve-5c65d4dbd9-hllt5\" (UID: \"0b2c23bf-3ebe-44f9-ba09-6b430d533106\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5c65d4dbd9-hllt5"
Apr 16 15:11:09.700265 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:11:09.700206 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/0b2c23bf-3ebe-44f9-ba09-6b430d533106-dshm\") pod \"stop-feature-test-kserve-5c65d4dbd9-hllt5\" (UID: \"0b2c23bf-3ebe-44f9-ba09-6b430d533106\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5c65d4dbd9-hllt5"
Apr 16 15:11:09.700409 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:11:09.700265 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0b2c23bf-3ebe-44f9-ba09-6b430d533106-tls-certs\") pod \"stop-feature-test-kserve-5c65d4dbd9-hllt5\" (UID: \"0b2c23bf-3ebe-44f9-ba09-6b430d533106\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5c65d4dbd9-hllt5"
Apr 16 15:11:09.700409 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:11:09.700312 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/0b2c23bf-3ebe-44f9-ba09-6b430d533106-home\") pod \"stop-feature-test-kserve-5c65d4dbd9-hllt5\" (UID: \"0b2c23bf-3ebe-44f9-ba09-6b430d533106\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5c65d4dbd9-hllt5"
Apr 16 15:11:09.801403 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:11:09.801365 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/0b2c23bf-3ebe-44f9-ba09-6b430d533106-dshm\") pod \"stop-feature-test-kserve-5c65d4dbd9-hllt5\" (UID: \"0b2c23bf-3ebe-44f9-ba09-6b430d533106\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5c65d4dbd9-hllt5"
Apr 16 15:11:09.801580 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:11:09.801423 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0b2c23bf-3ebe-44f9-ba09-6b430d533106-tls-certs\") pod \"stop-feature-test-kserve-5c65d4dbd9-hllt5\" (UID: \"0b2c23bf-3ebe-44f9-ba09-6b430d533106\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5c65d4dbd9-hllt5"
Apr 16 15:11:09.801580 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:11:09.801459 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/0b2c23bf-3ebe-44f9-ba09-6b430d533106-home\") pod \"stop-feature-test-kserve-5c65d4dbd9-hllt5\" (UID: \"0b2c23bf-3ebe-44f9-ba09-6b430d533106\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5c65d4dbd9-hllt5"
Apr 16 15:11:09.801657 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:11:09.801618 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/0b2c23bf-3ebe-44f9-ba09-6b430d533106-model-cache\") pod \"stop-feature-test-kserve-5c65d4dbd9-hllt5\" (UID: \"0b2c23bf-3ebe-44f9-ba09-6b430d533106\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5c65d4dbd9-hllt5"
Apr 16 15:11:09.801698 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:11:09.801684 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0b2c23bf-3ebe-44f9-ba09-6b430d533106-kserve-provision-location\") pod \"stop-feature-test-kserve-5c65d4dbd9-hllt5\" (UID: \"0b2c23bf-3ebe-44f9-ba09-6b430d533106\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5c65d4dbd9-hllt5"
Apr 16 15:11:09.801745 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:11:09.801719 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wxj6s\" (UniqueName: \"kubernetes.io/projected/0b2c23bf-3ebe-44f9-ba09-6b430d533106-kube-api-access-wxj6s\") pod \"stop-feature-test-kserve-5c65d4dbd9-hllt5\" (UID: \"0b2c23bf-3ebe-44f9-ba09-6b430d533106\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5c65d4dbd9-hllt5"
Apr 16 15:11:09.801819 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:11:09.801801 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/0b2c23bf-3ebe-44f9-ba09-6b430d533106-home\") pod \"stop-feature-test-kserve-5c65d4dbd9-hllt5\" (UID: \"0b2c23bf-3ebe-44f9-ba09-6b430d533106\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5c65d4dbd9-hllt5"
Apr 16 15:11:09.802025 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:11:09.801994 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/0b2c23bf-3ebe-44f9-ba09-6b430d533106-model-cache\") pod \"stop-feature-test-kserve-5c65d4dbd9-hllt5\" (UID: \"0b2c23bf-3ebe-44f9-ba09-6b430d533106\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5c65d4dbd9-hllt5"
Apr 16 15:11:09.802116 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:11:09.802052 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0b2c23bf-3ebe-44f9-ba09-6b430d533106-kserve-provision-location\") pod \"stop-feature-test-kserve-5c65d4dbd9-hllt5\" (UID: \"0b2c23bf-3ebe-44f9-ba09-6b430d533106\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5c65d4dbd9-hllt5"
Apr 16 15:11:09.803609 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:11:09.803582 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/0b2c23bf-3ebe-44f9-ba09-6b430d533106-dshm\") pod \"stop-feature-test-kserve-5c65d4dbd9-hllt5\" (UID: \"0b2c23bf-3ebe-44f9-ba09-6b430d533106\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5c65d4dbd9-hllt5"
Apr 16 15:11:09.804053 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:11:09.804031 2579 operation_generator.go:615]
"MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0b2c23bf-3ebe-44f9-ba09-6b430d533106-tls-certs\") pod \"stop-feature-test-kserve-5c65d4dbd9-hllt5\" (UID: \"0b2c23bf-3ebe-44f9-ba09-6b430d533106\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5c65d4dbd9-hllt5" Apr 16 15:11:09.809185 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:11:09.809157 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxj6s\" (UniqueName: \"kubernetes.io/projected/0b2c23bf-3ebe-44f9-ba09-6b430d533106-kube-api-access-wxj6s\") pod \"stop-feature-test-kserve-5c65d4dbd9-hllt5\" (UID: \"0b2c23bf-3ebe-44f9-ba09-6b430d533106\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5c65d4dbd9-hllt5" Apr 16 15:11:09.964478 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:11:09.964391 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5fdcbc6c9-dlj82"] Apr 16 15:11:09.966043 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:11:09.966020 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5c65d4dbd9-hllt5" Apr 16 15:11:09.968568 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:11:09.968548 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5fdcbc6c9-dlj82" Apr 16 15:11:09.970859 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:11:09.970836 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-epp-sa-dockercfg-9fmtl\"" Apr 16 15:11:09.979688 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:11:09.979644 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5fdcbc6c9-dlj82"] Apr 16 15:11:10.094477 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:11:10.094451 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-5c65d4dbd9-hllt5"] Apr 16 15:11:10.097313 ip-10-0-139-47 kubenswrapper[2579]: W0416 15:11:10.097285 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b2c23bf_3ebe_44f9_ba09_6b430d533106.slice/crio-1d38a295b910aaa04d40a377269adcbf475675ad1d0ddbb0255832e2466c60e7 WatchSource:0}: Error finding container 1d38a295b910aaa04d40a377269adcbf475675ad1d0ddbb0255832e2466c60e7: Status 404 returned error can't find the container with id 1d38a295b910aaa04d40a377269adcbf475675ad1d0ddbb0255832e2466c60e7 Apr 16 15:11:10.104128 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:11:10.104103 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/e5710448-1bc6-4953-aa42-7ec2723315b9-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-5fdcbc6c9-dlj82\" (UID: \"e5710448-1bc6-4953-aa42-7ec2723315b9\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5fdcbc6c9-dlj82" Apr 16 15:11:10.104229 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:11:10.104146 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-jhk4k\" (UniqueName: \"kubernetes.io/projected/e5710448-1bc6-4953-aa42-7ec2723315b9-kube-api-access-jhk4k\") pod \"stop-feature-test-kserve-router-scheduler-5fdcbc6c9-dlj82\" (UID: \"e5710448-1bc6-4953-aa42-7ec2723315b9\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5fdcbc6c9-dlj82" Apr 16 15:11:10.104229 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:11:10.104195 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/e5710448-1bc6-4953-aa42-7ec2723315b9-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-5fdcbc6c9-dlj82\" (UID: \"e5710448-1bc6-4953-aa42-7ec2723315b9\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5fdcbc6c9-dlj82" Apr 16 15:11:10.104229 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:11:10.104226 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/e5710448-1bc6-4953-aa42-7ec2723315b9-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-5fdcbc6c9-dlj82\" (UID: \"e5710448-1bc6-4953-aa42-7ec2723315b9\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5fdcbc6c9-dlj82" Apr 16 15:11:10.104321 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:11:10.104251 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e5710448-1bc6-4953-aa42-7ec2723315b9-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-5fdcbc6c9-dlj82\" (UID: \"e5710448-1bc6-4953-aa42-7ec2723315b9\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5fdcbc6c9-dlj82" Apr 16 15:11:10.104321 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:11:10.104267 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e5710448-1bc6-4953-aa42-7ec2723315b9-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-5fdcbc6c9-dlj82\" (UID: \"e5710448-1bc6-4953-aa42-7ec2723315b9\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5fdcbc6c9-dlj82" Apr 16 15:11:10.204674 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:11:10.204632 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/e5710448-1bc6-4953-aa42-7ec2723315b9-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-5fdcbc6c9-dlj82\" (UID: \"e5710448-1bc6-4953-aa42-7ec2723315b9\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5fdcbc6c9-dlj82" Apr 16 15:11:10.204862 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:11:10.204692 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jhk4k\" (UniqueName: \"kubernetes.io/projected/e5710448-1bc6-4953-aa42-7ec2723315b9-kube-api-access-jhk4k\") pod \"stop-feature-test-kserve-router-scheduler-5fdcbc6c9-dlj82\" (UID: \"e5710448-1bc6-4953-aa42-7ec2723315b9\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5fdcbc6c9-dlj82" Apr 16 15:11:10.204862 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:11:10.204747 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/e5710448-1bc6-4953-aa42-7ec2723315b9-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-5fdcbc6c9-dlj82\" (UID: \"e5710448-1bc6-4953-aa42-7ec2723315b9\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5fdcbc6c9-dlj82" Apr 16 15:11:10.204862 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:11:10.204780 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/e5710448-1bc6-4953-aa42-7ec2723315b9-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-5fdcbc6c9-dlj82\" (UID: \"e5710448-1bc6-4953-aa42-7ec2723315b9\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5fdcbc6c9-dlj82" Apr 16 15:11:10.204862 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:11:10.204810 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e5710448-1bc6-4953-aa42-7ec2723315b9-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-5fdcbc6c9-dlj82\" (UID: \"e5710448-1bc6-4953-aa42-7ec2723315b9\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5fdcbc6c9-dlj82" Apr 16 15:11:10.204862 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:11:10.204834 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e5710448-1bc6-4953-aa42-7ec2723315b9-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-5fdcbc6c9-dlj82\" (UID: \"e5710448-1bc6-4953-aa42-7ec2723315b9\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5fdcbc6c9-dlj82" Apr 16 15:11:10.205157 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:11:10.205098 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/e5710448-1bc6-4953-aa42-7ec2723315b9-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-5fdcbc6c9-dlj82\" (UID: \"e5710448-1bc6-4953-aa42-7ec2723315b9\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5fdcbc6c9-dlj82" Apr 16 15:11:10.205157 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:11:10.205113 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/e5710448-1bc6-4953-aa42-7ec2723315b9-tokenizer-cache\") pod 
\"stop-feature-test-kserve-router-scheduler-5fdcbc6c9-dlj82\" (UID: \"e5710448-1bc6-4953-aa42-7ec2723315b9\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5fdcbc6c9-dlj82" Apr 16 15:11:10.205256 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:11:10.205179 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/e5710448-1bc6-4953-aa42-7ec2723315b9-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-5fdcbc6c9-dlj82\" (UID: \"e5710448-1bc6-4953-aa42-7ec2723315b9\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5fdcbc6c9-dlj82" Apr 16 15:11:10.205256 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:11:10.205213 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e5710448-1bc6-4953-aa42-7ec2723315b9-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-5fdcbc6c9-dlj82\" (UID: \"e5710448-1bc6-4953-aa42-7ec2723315b9\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5fdcbc6c9-dlj82" Apr 16 15:11:10.207276 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:11:10.207256 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e5710448-1bc6-4953-aa42-7ec2723315b9-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-5fdcbc6c9-dlj82\" (UID: \"e5710448-1bc6-4953-aa42-7ec2723315b9\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5fdcbc6c9-dlj82" Apr 16 15:11:10.212345 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:11:10.212326 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhk4k\" (UniqueName: \"kubernetes.io/projected/e5710448-1bc6-4953-aa42-7ec2723315b9-kube-api-access-jhk4k\") pod \"stop-feature-test-kserve-router-scheduler-5fdcbc6c9-dlj82\" (UID: \"e5710448-1bc6-4953-aa42-7ec2723315b9\") " 
pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5fdcbc6c9-dlj82" Apr 16 15:11:10.293321 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:11:10.293285 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5fdcbc6c9-dlj82" Apr 16 15:11:10.429276 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:11:10.429237 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5fdcbc6c9-dlj82"] Apr 16 15:11:10.432006 ip-10-0-139-47 kubenswrapper[2579]: W0416 15:11:10.431975 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5710448_1bc6_4953_aa42_7ec2723315b9.slice/crio-0c2c469303bb8c605ec91d93a461138d13566030ae23278f659548775b75b35d WatchSource:0}: Error finding container 0c2c469303bb8c605ec91d93a461138d13566030ae23278f659548775b75b35d: Status 404 returned error can't find the container with id 0c2c469303bb8c605ec91d93a461138d13566030ae23278f659548775b75b35d Apr 16 15:11:10.774634 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:11:10.774592 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5fdcbc6c9-dlj82" event={"ID":"e5710448-1bc6-4953-aa42-7ec2723315b9","Type":"ContainerStarted","Data":"9410a18d448644762777dceb69dfe966a7aca949dbeb2389872766306f632880"} Apr 16 15:11:10.774634 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:11:10.774634 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5fdcbc6c9-dlj82" event={"ID":"e5710448-1bc6-4953-aa42-7ec2723315b9","Type":"ContainerStarted","Data":"0c2c469303bb8c605ec91d93a461138d13566030ae23278f659548775b75b35d"} Apr 16 15:11:10.775888 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:11:10.775854 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/stop-feature-test-kserve-5c65d4dbd9-hllt5" event={"ID":"0b2c23bf-3ebe-44f9-ba09-6b430d533106","Type":"ContainerStarted","Data":"43548c124712a7e58b0d85fd3a85b2d671e7319d4c207eb7d527d30e363887fd"} Apr 16 15:11:10.775888 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:11:10.775891 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5c65d4dbd9-hllt5" event={"ID":"0b2c23bf-3ebe-44f9-ba09-6b430d533106","Type":"ContainerStarted","Data":"1d38a295b910aaa04d40a377269adcbf475675ad1d0ddbb0255832e2466c60e7"} Apr 16 15:11:11.779876 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:11:11.779828 2579 generic.go:358] "Generic (PLEG): container finished" podID="e5710448-1bc6-4953-aa42-7ec2723315b9" containerID="9410a18d448644762777dceb69dfe966a7aca949dbeb2389872766306f632880" exitCode=0 Apr 16 15:11:11.780355 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:11:11.779934 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5fdcbc6c9-dlj82" event={"ID":"e5710448-1bc6-4953-aa42-7ec2723315b9","Type":"ContainerDied","Data":"9410a18d448644762777dceb69dfe966a7aca949dbeb2389872766306f632880"} Apr 16 15:11:12.786130 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:11:12.786079 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5fdcbc6c9-dlj82" event={"ID":"e5710448-1bc6-4953-aa42-7ec2723315b9","Type":"ContainerStarted","Data":"141e463310f2be35b3844dfc82a6e7160ec8ce6a4854a01d1a4965f4870ee777"} Apr 16 15:11:12.786130 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:11:12.786136 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5fdcbc6c9-dlj82" event={"ID":"e5710448-1bc6-4953-aa42-7ec2723315b9","Type":"ContainerStarted","Data":"e30b48cddbc85a38e83d3f410bca1fb48cc3018c4f13b449f345e47b86e148df"} Apr 16 15:11:12.786647 ip-10-0-139-47 
kubenswrapper[2579]: I0416 15:11:12.786248 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5fdcbc6c9-dlj82" Apr 16 15:11:12.807291 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:11:12.807237 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5fdcbc6c9-dlj82" podStartSLOduration=3.807221103 podStartE2EDuration="3.807221103s" podCreationTimestamp="2026-04-16 15:11:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:11:12.805269885 +0000 UTC m=+1121.987807669" watchObservedRunningTime="2026-04-16 15:11:12.807221103 +0000 UTC m=+1121.989758877" Apr 16 15:11:14.794622 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:11:14.794519 2579 generic.go:358] "Generic (PLEG): container finished" podID="0b2c23bf-3ebe-44f9-ba09-6b430d533106" containerID="43548c124712a7e58b0d85fd3a85b2d671e7319d4c207eb7d527d30e363887fd" exitCode=0 Apr 16 15:11:14.794622 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:11:14.794555 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5c65d4dbd9-hllt5" event={"ID":"0b2c23bf-3ebe-44f9-ba09-6b430d533106","Type":"ContainerDied","Data":"43548c124712a7e58b0d85fd3a85b2d671e7319d4c207eb7d527d30e363887fd"} Apr 16 15:11:14.873298 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:11:14.873261 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75c8dbd695nhfsd" Apr 16 15:11:14.881224 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:11:14.881201 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75c8dbd695nhfsd" Apr 16 15:11:15.801612 ip-10-0-139-47 
kubenswrapper[2579]: I0416 15:11:15.801567 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5c65d4dbd9-hllt5" event={"ID":"0b2c23bf-3ebe-44f9-ba09-6b430d533106","Type":"ContainerStarted","Data":"607c4d6a8954e794245b09bb791a1ea34e162db6cab3ede4aea106c14079f4a5"} Apr 16 15:11:15.823528 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:11:15.823473 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5c65d4dbd9-hllt5" podStartSLOduration=6.823452733 podStartE2EDuration="6.823452733s" podCreationTimestamp="2026-04-16 15:11:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:11:15.821115558 +0000 UTC m=+1125.003653338" watchObservedRunningTime="2026-04-16 15:11:15.823452733 +0000 UTC m=+1125.005990509" Apr 16 15:11:19.966714 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:11:19.966668 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5c65d4dbd9-hllt5" Apr 16 15:11:19.967212 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:11:19.966734 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5c65d4dbd9-hllt5" Apr 16 15:11:19.968212 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:11:19.968178 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5c65d4dbd9-hllt5" podUID="0b2c23bf-3ebe-44f9-ba09-6b430d533106" containerName="main" probeResult="failure" output="Get \"https://10.132.0.28:8000/health\": dial tcp 10.132.0.28:8000: connect: connection refused" Apr 16 15:11:20.294340 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:11:20.294299 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5fdcbc6c9-dlj82" Apr 16 15:11:20.294540 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:11:20.294455 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5fdcbc6c9-dlj82" Apr 16 15:11:20.297249 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:11:20.297223 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5fdcbc6c9-dlj82" Apr 16 15:11:20.828431 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:11:20.828399 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5fdcbc6c9-dlj82" Apr 16 15:11:29.967343 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:11:29.967305 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5c65d4dbd9-hllt5" podUID="0b2c23bf-3ebe-44f9-ba09-6b430d533106" containerName="main" probeResult="failure" output="Get \"https://10.132.0.28:8000/health\": dial tcp 10.132.0.28:8000: connect: connection refused" Apr 16 15:11:37.650467 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:11:37.650431 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75c8dbd695nhfsd"] Apr 16 15:11:37.651001 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:11:37.650760 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75c8dbd695nhfsd" podUID="160a4a04-d5e6-42fb-9939-141fbc367e66" containerName="main" containerID="cri-o://f34125821ac7808d2cafba05aabc48e1250fec0c85eb12219e861f311a4d37f0" gracePeriod=30 Apr 16 15:11:39.966643 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:11:39.966603 2579 prober.go:120] "Probe failed" probeType="Startup" 
pod="kserve-ci-e2e-test/stop-feature-test-kserve-5c65d4dbd9-hllt5" podUID="0b2c23bf-3ebe-44f9-ba09-6b430d533106" containerName="main" probeResult="failure" output="Get \"https://10.132.0.28:8000/health\": dial tcp 10.132.0.28:8000: connect: connection refused" Apr 16 15:11:42.835679 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:11:42.835642 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5fdcbc6c9-dlj82" Apr 16 15:11:49.967371 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:11:49.967267 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5c65d4dbd9-hllt5" podUID="0b2c23bf-3ebe-44f9-ba09-6b430d533106" containerName="main" probeResult="failure" output="Get \"https://10.132.0.28:8000/health\": dial tcp 10.132.0.28:8000: connect: connection refused" Apr 16 15:11:51.828476 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:11:51.828438 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-c9cd6ff97-qnr8c"] Apr 16 15:11:51.833938 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:11:51.833885 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-c9cd6ff97-qnr8c" Apr 16 15:11:51.837160 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:11:51.837127 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-test-kserve-self-signed-certs\"" Apr 16 15:11:51.847189 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:11:51.847157 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-c9cd6ff97-qnr8c"] Apr 16 15:11:51.858825 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:11:51.858785 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/0ba877d1-55ac-4d58-817d-544546b8b982-home\") pod \"custom-route-timeout-test-kserve-c9cd6ff97-qnr8c\" (UID: \"0ba877d1-55ac-4d58-817d-544546b8b982\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-c9cd6ff97-qnr8c" Apr 16 15:11:51.859114 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:11:51.859089 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/0ba877d1-55ac-4d58-817d-544546b8b982-dshm\") pod \"custom-route-timeout-test-kserve-c9cd6ff97-qnr8c\" (UID: \"0ba877d1-55ac-4d58-817d-544546b8b982\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-c9cd6ff97-qnr8c" Apr 16 15:11:51.859297 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:11:51.859277 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwk7t\" (UniqueName: \"kubernetes.io/projected/0ba877d1-55ac-4d58-817d-544546b8b982-kube-api-access-kwk7t\") pod \"custom-route-timeout-test-kserve-c9cd6ff97-qnr8c\" (UID: \"0ba877d1-55ac-4d58-817d-544546b8b982\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-c9cd6ff97-qnr8c" Apr 16 15:11:51.859478 ip-10-0-139-47 
kubenswrapper[2579]: I0416 15:11:51.859460 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0ba877d1-55ac-4d58-817d-544546b8b982-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-c9cd6ff97-qnr8c\" (UID: \"0ba877d1-55ac-4d58-817d-544546b8b982\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-c9cd6ff97-qnr8c" Apr 16 15:11:51.859699 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:11:51.859681 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0ba877d1-55ac-4d58-817d-544546b8b982-tls-certs\") pod \"custom-route-timeout-test-kserve-c9cd6ff97-qnr8c\" (UID: \"0ba877d1-55ac-4d58-817d-544546b8b982\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-c9cd6ff97-qnr8c" Apr 16 15:11:51.859879 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:11:51.859865 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/0ba877d1-55ac-4d58-817d-544546b8b982-model-cache\") pod \"custom-route-timeout-test-kserve-c9cd6ff97-qnr8c\" (UID: \"0ba877d1-55ac-4d58-817d-544546b8b982\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-c9cd6ff97-qnr8c" Apr 16 15:11:51.960813 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:11:51.960779 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/0ba877d1-55ac-4d58-817d-544546b8b982-home\") pod \"custom-route-timeout-test-kserve-c9cd6ff97-qnr8c\" (UID: \"0ba877d1-55ac-4d58-817d-544546b8b982\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-c9cd6ff97-qnr8c" Apr 16 15:11:51.960813 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:11:51.960815 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" 
(UniqueName: \"kubernetes.io/empty-dir/0ba877d1-55ac-4d58-817d-544546b8b982-dshm\") pod \"custom-route-timeout-test-kserve-c9cd6ff97-qnr8c\" (UID: \"0ba877d1-55ac-4d58-817d-544546b8b982\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-c9cd6ff97-qnr8c" Apr 16 15:11:51.961186 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:11:51.960849 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kwk7t\" (UniqueName: \"kubernetes.io/projected/0ba877d1-55ac-4d58-817d-544546b8b982-kube-api-access-kwk7t\") pod \"custom-route-timeout-test-kserve-c9cd6ff97-qnr8c\" (UID: \"0ba877d1-55ac-4d58-817d-544546b8b982\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-c9cd6ff97-qnr8c" Apr 16 15:11:51.961186 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:11:51.960877 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0ba877d1-55ac-4d58-817d-544546b8b982-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-c9cd6ff97-qnr8c\" (UID: \"0ba877d1-55ac-4d58-817d-544546b8b982\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-c9cd6ff97-qnr8c" Apr 16 15:11:51.961186 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:11:51.960939 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0ba877d1-55ac-4d58-817d-544546b8b982-tls-certs\") pod \"custom-route-timeout-test-kserve-c9cd6ff97-qnr8c\" (UID: \"0ba877d1-55ac-4d58-817d-544546b8b982\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-c9cd6ff97-qnr8c" Apr 16 15:11:51.961186 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:11:51.960987 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/0ba877d1-55ac-4d58-817d-544546b8b982-model-cache\") pod \"custom-route-timeout-test-kserve-c9cd6ff97-qnr8c\" (UID: 
\"0ba877d1-55ac-4d58-817d-544546b8b982\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-c9cd6ff97-qnr8c" Apr 16 15:11:51.961423 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:11:51.961383 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/0ba877d1-55ac-4d58-817d-544546b8b982-home\") pod \"custom-route-timeout-test-kserve-c9cd6ff97-qnr8c\" (UID: \"0ba877d1-55ac-4d58-817d-544546b8b982\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-c9cd6ff97-qnr8c" Apr 16 15:11:51.961506 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:11:51.961483 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0ba877d1-55ac-4d58-817d-544546b8b982-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-c9cd6ff97-qnr8c\" (UID: \"0ba877d1-55ac-4d58-817d-544546b8b982\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-c9cd6ff97-qnr8c" Apr 16 15:11:51.961627 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:11:51.961588 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/0ba877d1-55ac-4d58-817d-544546b8b982-model-cache\") pod \"custom-route-timeout-test-kserve-c9cd6ff97-qnr8c\" (UID: \"0ba877d1-55ac-4d58-817d-544546b8b982\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-c9cd6ff97-qnr8c" Apr 16 15:11:51.963698 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:11:51.963666 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0ba877d1-55ac-4d58-817d-544546b8b982-tls-certs\") pod \"custom-route-timeout-test-kserve-c9cd6ff97-qnr8c\" (UID: \"0ba877d1-55ac-4d58-817d-544546b8b982\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-c9cd6ff97-qnr8c" Apr 16 15:11:51.963823 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:11:51.963762 2579 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/0ba877d1-55ac-4d58-817d-544546b8b982-dshm\") pod \"custom-route-timeout-test-kserve-c9cd6ff97-qnr8c\" (UID: \"0ba877d1-55ac-4d58-817d-544546b8b982\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-c9cd6ff97-qnr8c" Apr 16 15:11:51.968695 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:11:51.968665 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwk7t\" (UniqueName: \"kubernetes.io/projected/0ba877d1-55ac-4d58-817d-544546b8b982-kube-api-access-kwk7t\") pod \"custom-route-timeout-test-kserve-c9cd6ff97-qnr8c\" (UID: \"0ba877d1-55ac-4d58-817d-544546b8b982\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-c9cd6ff97-qnr8c" Apr 16 15:11:52.146685 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:11:52.146581 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-c9cd6ff97-qnr8c" Apr 16 15:11:52.276193 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:11:52.276156 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-c9cd6ff97-qnr8c"] Apr 16 15:11:52.279532 ip-10-0-139-47 kubenswrapper[2579]: W0416 15:11:52.279494 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ba877d1_55ac_4d58_817d_544546b8b982.slice/crio-6cd175df943021671c8ba0472118e4af902860b89bb0b02882fcf992e19225d7 WatchSource:0}: Error finding container 6cd175df943021671c8ba0472118e4af902860b89bb0b02882fcf992e19225d7: Status 404 returned error can't find the container with id 6cd175df943021671c8ba0472118e4af902860b89bb0b02882fcf992e19225d7 Apr 16 15:11:52.934582 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:11:52.934547 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-c9cd6ff97-qnr8c" event={"ID":"0ba877d1-55ac-4d58-817d-544546b8b982","Type":"ContainerStarted","Data":"6134282e9a7448b7a350769aa51d53614ba7e838dad832b956237751751b0101"} Apr 16 15:11:52.934582 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:11:52.934584 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-c9cd6ff97-qnr8c" event={"ID":"0ba877d1-55ac-4d58-817d-544546b8b982","Type":"ContainerStarted","Data":"6cd175df943021671c8ba0472118e4af902860b89bb0b02882fcf992e19225d7"} Apr 16 15:11:56.954994 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:11:56.954958 2579 generic.go:358] "Generic (PLEG): container finished" podID="0ba877d1-55ac-4d58-817d-544546b8b982" containerID="6134282e9a7448b7a350769aa51d53614ba7e838dad832b956237751751b0101" exitCode=0 Apr 16 15:11:56.955402 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:11:56.955040 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-c9cd6ff97-qnr8c" event={"ID":"0ba877d1-55ac-4d58-817d-544546b8b982","Type":"ContainerDied","Data":"6134282e9a7448b7a350769aa51d53614ba7e838dad832b956237751751b0101"} Apr 16 15:11:57.960129 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:11:57.960096 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-c9cd6ff97-qnr8c" event={"ID":"0ba877d1-55ac-4d58-817d-544546b8b982","Type":"ContainerStarted","Data":"67bc49252b391a2c3857ad955e81fc78dff75bd9102ebef683bf4841e0ca0e59"} Apr 16 15:11:57.982226 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:11:57.982166 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-c9cd6ff97-qnr8c" podStartSLOduration=6.982126788 podStartE2EDuration="6.982126788s" podCreationTimestamp="2026-04-16 15:11:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:11:57.979684755 +0000 UTC m=+1167.162222532" watchObservedRunningTime="2026-04-16 15:11:57.982126788 +0000 UTC m=+1167.164664565" Apr 16 15:11:59.966651 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:11:59.966604 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5c65d4dbd9-hllt5" podUID="0b2c23bf-3ebe-44f9-ba09-6b430d533106" containerName="main" probeResult="failure" output="Get \"https://10.132.0.28:8000/health\": dial tcp 10.132.0.28:8000: connect: connection refused" Apr 16 15:12:02.147504 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:12:02.147455 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-c9cd6ff97-qnr8c" Apr 16 15:12:02.147895 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:12:02.147517 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-c9cd6ff97-qnr8c" Apr 16 15:12:02.149026 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:12:02.148995 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-c9cd6ff97-qnr8c" podUID="0ba877d1-55ac-4d58-817d-544546b8b982" containerName="main" probeResult="failure" output="Get \"https://10.132.0.30:8000/health\": dial tcp 10.132.0.30:8000: connect: connection refused" Apr 16 15:12:07.942630 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:12:07.942600 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75c8dbd695nhfsd_160a4a04-d5e6-42fb-9939-141fbc367e66/main/0.log" Apr 16 15:12:07.943088 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:12:07.943025 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75c8dbd695nhfsd" Apr 16 15:12:07.996181 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:12:07.996144 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75c8dbd695nhfsd_160a4a04-d5e6-42fb-9939-141fbc367e66/main/0.log" Apr 16 15:12:07.996526 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:12:07.996502 2579 generic.go:358] "Generic (PLEG): container finished" podID="160a4a04-d5e6-42fb-9939-141fbc367e66" containerID="f34125821ac7808d2cafba05aabc48e1250fec0c85eb12219e861f311a4d37f0" exitCode=137 Apr 16 15:12:07.996617 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:12:07.996579 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75c8dbd695nhfsd" event={"ID":"160a4a04-d5e6-42fb-9939-141fbc367e66","Type":"ContainerDied","Data":"f34125821ac7808d2cafba05aabc48e1250fec0c85eb12219e861f311a4d37f0"} Apr 16 15:12:07.996617 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:12:07.996587 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75c8dbd695nhfsd" Apr 16 15:12:07.996617 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:12:07.996605 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75c8dbd695nhfsd" event={"ID":"160a4a04-d5e6-42fb-9939-141fbc367e66","Type":"ContainerDied","Data":"ad0f28e5c4a188b73345ef16a64d8bc7a56aef232287a41a9da3f1851ae7dd67"} Apr 16 15:12:07.996617 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:12:07.996619 2579 scope.go:117] "RemoveContainer" containerID="f34125821ac7808d2cafba05aabc48e1250fec0c85eb12219e861f311a4d37f0" Apr 16 15:12:08.000253 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:12:08.000223 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/160a4a04-d5e6-42fb-9939-141fbc367e66-home\") pod \"160a4a04-d5e6-42fb-9939-141fbc367e66\" (UID: \"160a4a04-d5e6-42fb-9939-141fbc367e66\") " Apr 16 15:12:08.000370 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:12:08.000263 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/160a4a04-d5e6-42fb-9939-141fbc367e66-tls-certs\") pod \"160a4a04-d5e6-42fb-9939-141fbc367e66\" (UID: \"160a4a04-d5e6-42fb-9939-141fbc367e66\") " Apr 16 15:12:08.000370 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:12:08.000288 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b9g46\" (UniqueName: \"kubernetes.io/projected/160a4a04-d5e6-42fb-9939-141fbc367e66-kube-api-access-b9g46\") pod \"160a4a04-d5e6-42fb-9939-141fbc367e66\" (UID: \"160a4a04-d5e6-42fb-9939-141fbc367e66\") " Apr 16 15:12:08.000370 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:12:08.000305 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/160a4a04-d5e6-42fb-9939-141fbc367e66-kserve-provision-location\") pod \"160a4a04-d5e6-42fb-9939-141fbc367e66\" (UID: \"160a4a04-d5e6-42fb-9939-141fbc367e66\") " Apr 16 15:12:08.000370 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:12:08.000332 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/160a4a04-d5e6-42fb-9939-141fbc367e66-model-cache\") pod \"160a4a04-d5e6-42fb-9939-141fbc367e66\" (UID: \"160a4a04-d5e6-42fb-9939-141fbc367e66\") " Apr 16 15:12:08.000370 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:12:08.000362 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/160a4a04-d5e6-42fb-9939-141fbc367e66-dshm\") pod \"160a4a04-d5e6-42fb-9939-141fbc367e66\" (UID: \"160a4a04-d5e6-42fb-9939-141fbc367e66\") " Apr 16 15:12:08.000626 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:12:08.000604 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/160a4a04-d5e6-42fb-9939-141fbc367e66-home" (OuterVolumeSpecName: "home") pod "160a4a04-d5e6-42fb-9939-141fbc367e66" (UID: "160a4a04-d5e6-42fb-9939-141fbc367e66"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:12:08.000976 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:12:08.000939 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/160a4a04-d5e6-42fb-9939-141fbc367e66-model-cache" (OuterVolumeSpecName: "model-cache") pod "160a4a04-d5e6-42fb-9939-141fbc367e66" (UID: "160a4a04-d5e6-42fb-9939-141fbc367e66"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:12:08.003587 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:12:08.003546 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/160a4a04-d5e6-42fb-9939-141fbc367e66-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "160a4a04-d5e6-42fb-9939-141fbc367e66" (UID: "160a4a04-d5e6-42fb-9939-141fbc367e66"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 15:12:08.003682 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:12:08.003598 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/160a4a04-d5e6-42fb-9939-141fbc367e66-dshm" (OuterVolumeSpecName: "dshm") pod "160a4a04-d5e6-42fb-9939-141fbc367e66" (UID: "160a4a04-d5e6-42fb-9939-141fbc367e66"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:12:08.003682 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:12:08.003646 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/160a4a04-d5e6-42fb-9939-141fbc367e66-kube-api-access-b9g46" (OuterVolumeSpecName: "kube-api-access-b9g46") pod "160a4a04-d5e6-42fb-9939-141fbc367e66" (UID: "160a4a04-d5e6-42fb-9939-141fbc367e66"). InnerVolumeSpecName "kube-api-access-b9g46". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 15:12:08.024583 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:12:08.024557 2579 scope.go:117] "RemoveContainer" containerID="7566ee891ae890e44e6a50b0f316c2778768a3b27b5e5c3f5fb06747ac0934e9" Apr 16 15:12:08.069003 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:12:08.068960 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/160a4a04-d5e6-42fb-9939-141fbc367e66-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "160a4a04-d5e6-42fb-9939-141fbc367e66" (UID: "160a4a04-d5e6-42fb-9939-141fbc367e66"). 
InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:12:08.095668 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:12:08.095643 2579 scope.go:117] "RemoveContainer" containerID="f34125821ac7808d2cafba05aabc48e1250fec0c85eb12219e861f311a4d37f0" Apr 16 15:12:08.102291 ip-10-0-139-47 kubenswrapper[2579]: E0416 15:12:08.096022 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f34125821ac7808d2cafba05aabc48e1250fec0c85eb12219e861f311a4d37f0\": container with ID starting with f34125821ac7808d2cafba05aabc48e1250fec0c85eb12219e861f311a4d37f0 not found: ID does not exist" containerID="f34125821ac7808d2cafba05aabc48e1250fec0c85eb12219e861f311a4d37f0" Apr 16 15:12:08.102291 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:12:08.096522 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f34125821ac7808d2cafba05aabc48e1250fec0c85eb12219e861f311a4d37f0"} err="failed to get container status \"f34125821ac7808d2cafba05aabc48e1250fec0c85eb12219e861f311a4d37f0\": rpc error: code = NotFound desc = could not find container \"f34125821ac7808d2cafba05aabc48e1250fec0c85eb12219e861f311a4d37f0\": container with ID starting with f34125821ac7808d2cafba05aabc48e1250fec0c85eb12219e861f311a4d37f0 not found: ID does not exist" Apr 16 15:12:08.102291 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:12:08.096568 2579 scope.go:117] "RemoveContainer" containerID="7566ee891ae890e44e6a50b0f316c2778768a3b27b5e5c3f5fb06747ac0934e9" Apr 16 15:12:08.102767 ip-10-0-139-47 kubenswrapper[2579]: E0416 15:12:08.102737 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7566ee891ae890e44e6a50b0f316c2778768a3b27b5e5c3f5fb06747ac0934e9\": container with ID starting with 7566ee891ae890e44e6a50b0f316c2778768a3b27b5e5c3f5fb06747ac0934e9 not found: ID does not exist" 
containerID="7566ee891ae890e44e6a50b0f316c2778768a3b27b5e5c3f5fb06747ac0934e9" Apr 16 15:12:08.102851 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:12:08.102782 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7566ee891ae890e44e6a50b0f316c2778768a3b27b5e5c3f5fb06747ac0934e9"} err="failed to get container status \"7566ee891ae890e44e6a50b0f316c2778768a3b27b5e5c3f5fb06747ac0934e9\": rpc error: code = NotFound desc = could not find container \"7566ee891ae890e44e6a50b0f316c2778768a3b27b5e5c3f5fb06747ac0934e9\": container with ID starting with 7566ee891ae890e44e6a50b0f316c2778768a3b27b5e5c3f5fb06747ac0934e9 not found: ID does not exist" Apr 16 15:12:08.103149 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:12:08.103126 2579 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/160a4a04-d5e6-42fb-9939-141fbc367e66-home\") on node \"ip-10-0-139-47.ec2.internal\" DevicePath \"\"" Apr 16 15:12:08.103193 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:12:08.103173 2579 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/160a4a04-d5e6-42fb-9939-141fbc367e66-tls-certs\") on node \"ip-10-0-139-47.ec2.internal\" DevicePath \"\"" Apr 16 15:12:08.103193 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:12:08.103186 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-b9g46\" (UniqueName: \"kubernetes.io/projected/160a4a04-d5e6-42fb-9939-141fbc367e66-kube-api-access-b9g46\") on node \"ip-10-0-139-47.ec2.internal\" DevicePath \"\"" Apr 16 15:12:08.103258 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:12:08.103197 2579 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/160a4a04-d5e6-42fb-9939-141fbc367e66-kserve-provision-location\") on node \"ip-10-0-139-47.ec2.internal\" DevicePath \"\"" Apr 16 15:12:08.103258 ip-10-0-139-47 kubenswrapper[2579]: I0416 
15:12:08.103206 2579 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/160a4a04-d5e6-42fb-9939-141fbc367e66-model-cache\") on node \"ip-10-0-139-47.ec2.internal\" DevicePath \"\"" Apr 16 15:12:08.103258 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:12:08.103215 2579 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/160a4a04-d5e6-42fb-9939-141fbc367e66-dshm\") on node \"ip-10-0-139-47.ec2.internal\" DevicePath \"\"" Apr 16 15:12:08.320784 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:12:08.320745 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75c8dbd695nhfsd"] Apr 16 15:12:08.322794 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:12:08.322763 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75c8dbd695nhfsd"] Apr 16 15:12:09.355084 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:12:09.355055 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="160a4a04-d5e6-42fb-9939-141fbc367e66" path="/var/lib/kubelet/pods/160a4a04-d5e6-42fb-9939-141fbc367e66/volumes" Apr 16 15:12:09.967230 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:12:09.967186 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5c65d4dbd9-hllt5" podUID="0b2c23bf-3ebe-44f9-ba09-6b430d533106" containerName="main" probeResult="failure" output="Get \"https://10.132.0.28:8000/health\": dial tcp 10.132.0.28:8000: connect: connection refused" Apr 16 15:12:12.147554 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:12:12.147512 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-c9cd6ff97-qnr8c" podUID="0ba877d1-55ac-4d58-817d-544546b8b982" containerName="main" probeResult="failure" output="Get \"https://10.132.0.30:8000/health\": dial 
tcp 10.132.0.30:8000: connect: connection refused" Apr 16 15:12:19.966627 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:12:19.966571 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5c65d4dbd9-hllt5" podUID="0b2c23bf-3ebe-44f9-ba09-6b430d533106" containerName="main" probeResult="failure" output="Get \"https://10.132.0.28:8000/health\": dial tcp 10.132.0.28:8000: connect: connection refused" Apr 16 15:12:22.147178 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:12:22.147123 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-c9cd6ff97-qnr8c" podUID="0ba877d1-55ac-4d58-817d-544546b8b982" containerName="main" probeResult="failure" output="Get \"https://10.132.0.30:8000/health\": dial tcp 10.132.0.30:8000: connect: connection refused" Apr 16 15:12:29.966583 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:12:29.966533 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5c65d4dbd9-hllt5" podUID="0b2c23bf-3ebe-44f9-ba09-6b430d533106" containerName="main" probeResult="failure" output="Get \"https://10.132.0.28:8000/health\": dial tcp 10.132.0.28:8000: connect: connection refused" Apr 16 15:12:32.148054 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:12:32.148000 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-c9cd6ff97-qnr8c" podUID="0ba877d1-55ac-4d58-817d-544546b8b982" containerName="main" probeResult="failure" output="Get \"https://10.132.0.30:8000/health\": dial tcp 10.132.0.30:8000: connect: connection refused" Apr 16 15:12:39.967101 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:12:39.967055 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5c65d4dbd9-hllt5" podUID="0b2c23bf-3ebe-44f9-ba09-6b430d533106" containerName="main" probeResult="failure" output="Get \"https://10.132.0.28:8000/health\": dial 
tcp 10.132.0.28:8000: connect: connection refused" Apr 16 15:12:42.147160 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:12:42.147104 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-c9cd6ff97-qnr8c" podUID="0ba877d1-55ac-4d58-817d-544546b8b982" containerName="main" probeResult="failure" output="Get \"https://10.132.0.30:8000/health\": dial tcp 10.132.0.30:8000: connect: connection refused" Apr 16 15:12:49.976286 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:12:49.976249 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5c65d4dbd9-hllt5" Apr 16 15:12:49.984736 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:12:49.984703 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5c65d4dbd9-hllt5" Apr 16 15:12:50.736538 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:12:50.736505 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-5c65d4dbd9-hllt5"] Apr 16 15:12:51.155057 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:12:51.155017 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5c65d4dbd9-hllt5" podUID="0b2c23bf-3ebe-44f9-ba09-6b430d533106" containerName="main" containerID="cri-o://607c4d6a8954e794245b09bb791a1ea34e162db6cab3ede4aea106c14079f4a5" gracePeriod=30 Apr 16 15:12:52.147191 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:12:52.147149 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-c9cd6ff97-qnr8c" podUID="0ba877d1-55ac-4d58-817d-544546b8b982" containerName="main" probeResult="failure" output="Get \"https://10.132.0.30:8000/health\": dial tcp 10.132.0.30:8000: connect: connection refused" Apr 16 15:13:01.039477 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:01.039427 2579 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5fdcbc6c9-dlj82"] Apr 16 15:13:01.039930 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:01.039859 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5fdcbc6c9-dlj82" podUID="e5710448-1bc6-4953-aa42-7ec2723315b9" containerName="main" containerID="cri-o://e30b48cddbc85a38e83d3f410bca1fb48cc3018c4f13b449f345e47b86e148df" gracePeriod=30 Apr 16 15:13:01.040000 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:01.039915 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5fdcbc6c9-dlj82" podUID="e5710448-1bc6-4953-aa42-7ec2723315b9" containerName="tokenizer" containerID="cri-o://141e463310f2be35b3844dfc82a6e7160ec8ce6a4854a01d1a4965f4870ee777" gracePeriod=30 Apr 16 15:13:01.190973 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:01.190936 2579 generic.go:358] "Generic (PLEG): container finished" podID="e5710448-1bc6-4953-aa42-7ec2723315b9" containerID="e30b48cddbc85a38e83d3f410bca1fb48cc3018c4f13b449f345e47b86e148df" exitCode=0 Apr 16 15:13:01.191172 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:01.191004 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5fdcbc6c9-dlj82" event={"ID":"e5710448-1bc6-4953-aa42-7ec2723315b9","Type":"ContainerDied","Data":"e30b48cddbc85a38e83d3f410bca1fb48cc3018c4f13b449f345e47b86e148df"} Apr 16 15:13:02.147593 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:02.147536 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-c9cd6ff97-qnr8c" podUID="0ba877d1-55ac-4d58-817d-544546b8b982" containerName="main" probeResult="failure" output="Get \"https://10.132.0.30:8000/health\": dial tcp 10.132.0.30:8000: connect: connection 
refused" Apr 16 15:13:02.495288 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:02.495260 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5fdcbc6c9-dlj82" Apr 16 15:13:02.600016 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:02.599969 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/e5710448-1bc6-4953-aa42-7ec2723315b9-tokenizer-tmp\") pod \"e5710448-1bc6-4953-aa42-7ec2723315b9\" (UID: \"e5710448-1bc6-4953-aa42-7ec2723315b9\") " Apr 16 15:13:02.600185 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:02.600053 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhk4k\" (UniqueName: \"kubernetes.io/projected/e5710448-1bc6-4953-aa42-7ec2723315b9-kube-api-access-jhk4k\") pod \"e5710448-1bc6-4953-aa42-7ec2723315b9\" (UID: \"e5710448-1bc6-4953-aa42-7ec2723315b9\") " Apr 16 15:13:02.600185 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:02.600119 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/e5710448-1bc6-4953-aa42-7ec2723315b9-tokenizer-uds\") pod \"e5710448-1bc6-4953-aa42-7ec2723315b9\" (UID: \"e5710448-1bc6-4953-aa42-7ec2723315b9\") " Apr 16 15:13:02.600185 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:02.600178 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/e5710448-1bc6-4953-aa42-7ec2723315b9-tokenizer-cache\") pod \"e5710448-1bc6-4953-aa42-7ec2723315b9\" (UID: \"e5710448-1bc6-4953-aa42-7ec2723315b9\") " Apr 16 15:13:02.600365 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:02.600207 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/e5710448-1bc6-4953-aa42-7ec2723315b9-kserve-provision-location\") pod \"e5710448-1bc6-4953-aa42-7ec2723315b9\" (UID: \"e5710448-1bc6-4953-aa42-7ec2723315b9\") "
Apr 16 15:13:02.600365 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:02.600233 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e5710448-1bc6-4953-aa42-7ec2723315b9-tls-certs\") pod \"e5710448-1bc6-4953-aa42-7ec2723315b9\" (UID: \"e5710448-1bc6-4953-aa42-7ec2723315b9\") "
Apr 16 15:13:02.600460 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:02.600346 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5710448-1bc6-4953-aa42-7ec2723315b9-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "e5710448-1bc6-4953-aa42-7ec2723315b9" (UID: "e5710448-1bc6-4953-aa42-7ec2723315b9"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 15:13:02.600460 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:02.600375 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5710448-1bc6-4953-aa42-7ec2723315b9-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "e5710448-1bc6-4953-aa42-7ec2723315b9" (UID: "e5710448-1bc6-4953-aa42-7ec2723315b9"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 15:13:02.600546 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:02.600458 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5710448-1bc6-4953-aa42-7ec2723315b9-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "e5710448-1bc6-4953-aa42-7ec2723315b9" (UID: "e5710448-1bc6-4953-aa42-7ec2723315b9"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 15:13:02.600546 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:02.600508 2579 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/e5710448-1bc6-4953-aa42-7ec2723315b9-tokenizer-tmp\") on node \"ip-10-0-139-47.ec2.internal\" DevicePath \"\""
Apr 16 15:13:02.600546 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:02.600528 2579 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/e5710448-1bc6-4953-aa42-7ec2723315b9-tokenizer-uds\") on node \"ip-10-0-139-47.ec2.internal\" DevicePath \"\""
Apr 16 15:13:02.600959 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:02.600922 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5710448-1bc6-4953-aa42-7ec2723315b9-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "e5710448-1bc6-4953-aa42-7ec2723315b9" (UID: "e5710448-1bc6-4953-aa42-7ec2723315b9"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 15:13:02.602432 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:02.602407 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5710448-1bc6-4953-aa42-7ec2723315b9-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "e5710448-1bc6-4953-aa42-7ec2723315b9" (UID: "e5710448-1bc6-4953-aa42-7ec2723315b9"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 15:13:02.602540 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:02.602469 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5710448-1bc6-4953-aa42-7ec2723315b9-kube-api-access-jhk4k" (OuterVolumeSpecName: "kube-api-access-jhk4k") pod "e5710448-1bc6-4953-aa42-7ec2723315b9" (UID: "e5710448-1bc6-4953-aa42-7ec2723315b9"). InnerVolumeSpecName "kube-api-access-jhk4k". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 15:13:02.701194 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:02.701109 2579 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/e5710448-1bc6-4953-aa42-7ec2723315b9-tokenizer-cache\") on node \"ip-10-0-139-47.ec2.internal\" DevicePath \"\""
Apr 16 15:13:02.701194 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:02.701141 2579 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e5710448-1bc6-4953-aa42-7ec2723315b9-kserve-provision-location\") on node \"ip-10-0-139-47.ec2.internal\" DevicePath \"\""
Apr 16 15:13:02.701194 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:02.701152 2579 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e5710448-1bc6-4953-aa42-7ec2723315b9-tls-certs\") on node \"ip-10-0-139-47.ec2.internal\" DevicePath \"\""
Apr 16 15:13:02.701194 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:02.701162 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jhk4k\" (UniqueName: \"kubernetes.io/projected/e5710448-1bc6-4953-aa42-7ec2723315b9-kube-api-access-jhk4k\") on node \"ip-10-0-139-47.ec2.internal\" DevicePath \"\""
Apr 16 15:13:03.199329 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:03.199297 2579 generic.go:358] "Generic (PLEG): container finished" podID="e5710448-1bc6-4953-aa42-7ec2723315b9" containerID="141e463310f2be35b3844dfc82a6e7160ec8ce6a4854a01d1a4965f4870ee777" exitCode=0
Apr 16 15:13:03.199764 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:03.199379 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5fdcbc6c9-dlj82"
Apr 16 15:13:03.199764 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:03.199386 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5fdcbc6c9-dlj82" event={"ID":"e5710448-1bc6-4953-aa42-7ec2723315b9","Type":"ContainerDied","Data":"141e463310f2be35b3844dfc82a6e7160ec8ce6a4854a01d1a4965f4870ee777"}
Apr 16 15:13:03.199764 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:03.199419 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5fdcbc6c9-dlj82" event={"ID":"e5710448-1bc6-4953-aa42-7ec2723315b9","Type":"ContainerDied","Data":"0c2c469303bb8c605ec91d93a461138d13566030ae23278f659548775b75b35d"}
Apr 16 15:13:03.199764 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:03.199434 2579 scope.go:117] "RemoveContainer" containerID="141e463310f2be35b3844dfc82a6e7160ec8ce6a4854a01d1a4965f4870ee777"
Apr 16 15:13:03.207842 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:03.207820 2579 scope.go:117] "RemoveContainer" containerID="e30b48cddbc85a38e83d3f410bca1fb48cc3018c4f13b449f345e47b86e148df"
Apr 16 15:13:03.215184 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:03.215162 2579 scope.go:117] "RemoveContainer" containerID="9410a18d448644762777dceb69dfe966a7aca949dbeb2389872766306f632880"
Apr 16 15:13:03.219799 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:03.219773 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5fdcbc6c9-dlj82"]
Apr 16 15:13:03.223285 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:03.223263 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5fdcbc6c9-dlj82"]
Apr 16 15:13:03.223612 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:03.223595 2579 scope.go:117] "RemoveContainer" containerID="141e463310f2be35b3844dfc82a6e7160ec8ce6a4854a01d1a4965f4870ee777"
Apr 16 15:13:03.223884 ip-10-0-139-47 kubenswrapper[2579]: E0416 15:13:03.223865 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"141e463310f2be35b3844dfc82a6e7160ec8ce6a4854a01d1a4965f4870ee777\": container with ID starting with 141e463310f2be35b3844dfc82a6e7160ec8ce6a4854a01d1a4965f4870ee777 not found: ID does not exist" containerID="141e463310f2be35b3844dfc82a6e7160ec8ce6a4854a01d1a4965f4870ee777"
Apr 16 15:13:03.223991 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:03.223915 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"141e463310f2be35b3844dfc82a6e7160ec8ce6a4854a01d1a4965f4870ee777"} err="failed to get container status \"141e463310f2be35b3844dfc82a6e7160ec8ce6a4854a01d1a4965f4870ee777\": rpc error: code = NotFound desc = could not find container \"141e463310f2be35b3844dfc82a6e7160ec8ce6a4854a01d1a4965f4870ee777\": container with ID starting with 141e463310f2be35b3844dfc82a6e7160ec8ce6a4854a01d1a4965f4870ee777 not found: ID does not exist"
Apr 16 15:13:03.223991 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:03.223945 2579 scope.go:117] "RemoveContainer" containerID="e30b48cddbc85a38e83d3f410bca1fb48cc3018c4f13b449f345e47b86e148df"
Apr 16 15:13:03.224167 ip-10-0-139-47 kubenswrapper[2579]: E0416 15:13:03.224153 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e30b48cddbc85a38e83d3f410bca1fb48cc3018c4f13b449f345e47b86e148df\": container with ID starting with e30b48cddbc85a38e83d3f410bca1fb48cc3018c4f13b449f345e47b86e148df not found: ID does not exist" containerID="e30b48cddbc85a38e83d3f410bca1fb48cc3018c4f13b449f345e47b86e148df"
Apr 16 15:13:03.224227 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:03.224175 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e30b48cddbc85a38e83d3f410bca1fb48cc3018c4f13b449f345e47b86e148df"} err="failed to get container status \"e30b48cddbc85a38e83d3f410bca1fb48cc3018c4f13b449f345e47b86e148df\": rpc error: code = NotFound desc = could not find container \"e30b48cddbc85a38e83d3f410bca1fb48cc3018c4f13b449f345e47b86e148df\": container with ID starting with e30b48cddbc85a38e83d3f410bca1fb48cc3018c4f13b449f345e47b86e148df not found: ID does not exist"
Apr 16 15:13:03.224227 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:03.224198 2579 scope.go:117] "RemoveContainer" containerID="9410a18d448644762777dceb69dfe966a7aca949dbeb2389872766306f632880"
Apr 16 15:13:03.224419 ip-10-0-139-47 kubenswrapper[2579]: E0416 15:13:03.224401 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9410a18d448644762777dceb69dfe966a7aca949dbeb2389872766306f632880\": container with ID starting with 9410a18d448644762777dceb69dfe966a7aca949dbeb2389872766306f632880 not found: ID does not exist" containerID="9410a18d448644762777dceb69dfe966a7aca949dbeb2389872766306f632880"
Apr 16 15:13:03.224472 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:03.224427 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9410a18d448644762777dceb69dfe966a7aca949dbeb2389872766306f632880"} err="failed to get container status \"9410a18d448644762777dceb69dfe966a7aca949dbeb2389872766306f632880\": rpc error: code = NotFound desc = could not find container \"9410a18d448644762777dceb69dfe966a7aca949dbeb2389872766306f632880\": container with ID starting with 9410a18d448644762777dceb69dfe966a7aca949dbeb2389872766306f632880 not found: ID does not exist"
Apr 16 15:13:03.354227 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:03.354184 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5710448-1bc6-4953-aa42-7ec2723315b9" path="/var/lib/kubelet/pods/e5710448-1bc6-4953-aa42-7ec2723315b9/volumes"
Apr 16 15:13:12.147617 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:12.147557 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-c9cd6ff97-qnr8c" podUID="0ba877d1-55ac-4d58-817d-544546b8b982" containerName="main" probeResult="failure" output="Get \"https://10.132.0.30:8000/health\": dial tcp 10.132.0.30:8000: connect: connection refused"
Apr 16 15:13:15.148727 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:15.148696 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-5c65d4dbd9-h86zq"]
Apr 16 15:13:15.149192 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:15.149146 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e5710448-1bc6-4953-aa42-7ec2723315b9" containerName="storage-initializer"
Apr 16 15:13:15.149192 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:15.149166 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5710448-1bc6-4953-aa42-7ec2723315b9" containerName="storage-initializer"
Apr 16 15:13:15.149192 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:15.149180 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="160a4a04-d5e6-42fb-9939-141fbc367e66" containerName="storage-initializer"
Apr 16 15:13:15.149192 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:15.149187 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="160a4a04-d5e6-42fb-9939-141fbc367e66" containerName="storage-initializer"
Apr 16 15:13:15.149402 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:15.149198 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e5710448-1bc6-4953-aa42-7ec2723315b9" containerName="tokenizer"
Apr 16 15:13:15.149402 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:15.149206 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5710448-1bc6-4953-aa42-7ec2723315b9" containerName="tokenizer"
Apr 16 15:13:15.149402 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:15.149218 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="160a4a04-d5e6-42fb-9939-141fbc367e66" containerName="main"
Apr 16 15:13:15.149402 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:15.149227 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="160a4a04-d5e6-42fb-9939-141fbc367e66" containerName="main"
Apr 16 15:13:15.149402 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:15.149239 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e5710448-1bc6-4953-aa42-7ec2723315b9" containerName="main"
Apr 16 15:13:15.149402 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:15.149248 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5710448-1bc6-4953-aa42-7ec2723315b9" containerName="main"
Apr 16 15:13:15.149402 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:15.149317 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="160a4a04-d5e6-42fb-9939-141fbc367e66" containerName="main"
Apr 16 15:13:15.149402 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:15.149332 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="e5710448-1bc6-4953-aa42-7ec2723315b9" containerName="tokenizer"
Apr 16 15:13:15.149402 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:15.149345 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="e5710448-1bc6-4953-aa42-7ec2723315b9" containerName="main"
Apr 16 15:13:15.152971 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:15.152948 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5c65d4dbd9-h86zq"
Apr 16 15:13:15.163436 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:15.163407 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-5c65d4dbd9-h86zq"]
Apr 16 15:13:15.205765 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:15.205732 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1ce9d9d4-356a-488e-8f90-996b73151857-home\") pod \"stop-feature-test-kserve-5c65d4dbd9-h86zq\" (UID: \"1ce9d9d4-356a-488e-8f90-996b73151857\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5c65d4dbd9-h86zq"
Apr 16 15:13:15.205942 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:15.205785 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ckw7\" (UniqueName: \"kubernetes.io/projected/1ce9d9d4-356a-488e-8f90-996b73151857-kube-api-access-7ckw7\") pod \"stop-feature-test-kserve-5c65d4dbd9-h86zq\" (UID: \"1ce9d9d4-356a-488e-8f90-996b73151857\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5c65d4dbd9-h86zq"
Apr 16 15:13:15.205942 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:15.205864 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1ce9d9d4-356a-488e-8f90-996b73151857-model-cache\") pod \"stop-feature-test-kserve-5c65d4dbd9-h86zq\" (UID: \"1ce9d9d4-356a-488e-8f90-996b73151857\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5c65d4dbd9-h86zq"
Apr 16 15:13:15.205942 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:15.205928 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1ce9d9d4-356a-488e-8f90-996b73151857-dshm\") pod \"stop-feature-test-kserve-5c65d4dbd9-h86zq\" (UID: \"1ce9d9d4-356a-488e-8f90-996b73151857\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5c65d4dbd9-h86zq"
Apr 16 15:13:15.206062 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:15.205946 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1ce9d9d4-356a-488e-8f90-996b73151857-tls-certs\") pod \"stop-feature-test-kserve-5c65d4dbd9-h86zq\" (UID: \"1ce9d9d4-356a-488e-8f90-996b73151857\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5c65d4dbd9-h86zq"
Apr 16 15:13:15.206062 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:15.205966 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1ce9d9d4-356a-488e-8f90-996b73151857-kserve-provision-location\") pod \"stop-feature-test-kserve-5c65d4dbd9-h86zq\" (UID: \"1ce9d9d4-356a-488e-8f90-996b73151857\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5c65d4dbd9-h86zq"
Apr 16 15:13:15.307134 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:15.307099 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1ce9d9d4-356a-488e-8f90-996b73151857-dshm\") pod \"stop-feature-test-kserve-5c65d4dbd9-h86zq\" (UID: \"1ce9d9d4-356a-488e-8f90-996b73151857\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5c65d4dbd9-h86zq"
Apr 16 15:13:15.307134 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:15.307133 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1ce9d9d4-356a-488e-8f90-996b73151857-tls-certs\") pod \"stop-feature-test-kserve-5c65d4dbd9-h86zq\" (UID: \"1ce9d9d4-356a-488e-8f90-996b73151857\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5c65d4dbd9-h86zq"
Apr 16 15:13:15.307367 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:15.307150 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1ce9d9d4-356a-488e-8f90-996b73151857-kserve-provision-location\") pod \"stop-feature-test-kserve-5c65d4dbd9-h86zq\" (UID: \"1ce9d9d4-356a-488e-8f90-996b73151857\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5c65d4dbd9-h86zq"
Apr 16 15:13:15.307367 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:15.307181 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1ce9d9d4-356a-488e-8f90-996b73151857-home\") pod \"stop-feature-test-kserve-5c65d4dbd9-h86zq\" (UID: \"1ce9d9d4-356a-488e-8f90-996b73151857\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5c65d4dbd9-h86zq"
Apr 16 15:13:15.307476 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:15.307381 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7ckw7\" (UniqueName: \"kubernetes.io/projected/1ce9d9d4-356a-488e-8f90-996b73151857-kube-api-access-7ckw7\") pod \"stop-feature-test-kserve-5c65d4dbd9-h86zq\" (UID: \"1ce9d9d4-356a-488e-8f90-996b73151857\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5c65d4dbd9-h86zq"
Apr 16 15:13:15.307476 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:15.307450 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1ce9d9d4-356a-488e-8f90-996b73151857-model-cache\") pod \"stop-feature-test-kserve-5c65d4dbd9-h86zq\" (UID: \"1ce9d9d4-356a-488e-8f90-996b73151857\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5c65d4dbd9-h86zq"
Apr 16 15:13:15.307587 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:15.307566 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1ce9d9d4-356a-488e-8f90-996b73151857-home\") pod \"stop-feature-test-kserve-5c65d4dbd9-h86zq\" (UID: \"1ce9d9d4-356a-488e-8f90-996b73151857\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5c65d4dbd9-h86zq"
Apr 16 15:13:15.308193 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:15.308166 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1ce9d9d4-356a-488e-8f90-996b73151857-model-cache\") pod \"stop-feature-test-kserve-5c65d4dbd9-h86zq\" (UID: \"1ce9d9d4-356a-488e-8f90-996b73151857\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5c65d4dbd9-h86zq"
Apr 16 15:13:15.308392 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:15.307597 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1ce9d9d4-356a-488e-8f90-996b73151857-kserve-provision-location\") pod \"stop-feature-test-kserve-5c65d4dbd9-h86zq\" (UID: \"1ce9d9d4-356a-488e-8f90-996b73151857\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5c65d4dbd9-h86zq"
Apr 16 15:13:15.315389 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:15.314777 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1ce9d9d4-356a-488e-8f90-996b73151857-tls-certs\") pod \"stop-feature-test-kserve-5c65d4dbd9-h86zq\" (UID: \"1ce9d9d4-356a-488e-8f90-996b73151857\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5c65d4dbd9-h86zq"
Apr 16 15:13:15.316032 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:15.316007 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1ce9d9d4-356a-488e-8f90-996b73151857-dshm\") pod \"stop-feature-test-kserve-5c65d4dbd9-h86zq\" (UID: \"1ce9d9d4-356a-488e-8f90-996b73151857\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5c65d4dbd9-h86zq"
Apr 16 15:13:15.316485 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:15.316459 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ckw7\" (UniqueName: \"kubernetes.io/projected/1ce9d9d4-356a-488e-8f90-996b73151857-kube-api-access-7ckw7\") pod \"stop-feature-test-kserve-5c65d4dbd9-h86zq\" (UID: \"1ce9d9d4-356a-488e-8f90-996b73151857\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5c65d4dbd9-h86zq"
Apr 16 15:13:15.466003 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:15.465895 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5c65d4dbd9-h86zq"
Apr 16 15:13:15.597130 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:15.597102 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-5c65d4dbd9-h86zq"]
Apr 16 15:13:15.599412 ip-10-0-139-47 kubenswrapper[2579]: W0416 15:13:15.599377 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ce9d9d4_356a_488e_8f90_996b73151857.slice/crio-06e9651d87e9a28dd81d8bd0cc7c1d08359dca77134c8069491897a30d9ba632 WatchSource:0}: Error finding container 06e9651d87e9a28dd81d8bd0cc7c1d08359dca77134c8069491897a30d9ba632: Status 404 returned error can't find the container with id 06e9651d87e9a28dd81d8bd0cc7c1d08359dca77134c8069491897a30d9ba632
Apr 16 15:13:15.601886 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:15.601866 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 15:13:16.243671 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:16.243629 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5c65d4dbd9-h86zq" event={"ID":"1ce9d9d4-356a-488e-8f90-996b73151857","Type":"ContainerStarted","Data":"5d60d241ab449c1a8b10da8ed2dba6c52b825b5d138006e8606758576db4c16b"}
Apr 16 15:13:16.243671 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:16.243668 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5c65d4dbd9-h86zq" event={"ID":"1ce9d9d4-356a-488e-8f90-996b73151857","Type":"ContainerStarted","Data":"06e9651d87e9a28dd81d8bd0cc7c1d08359dca77134c8069491897a30d9ba632"}
Apr 16 15:13:20.260039 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:20.260005 2579 generic.go:358] "Generic (PLEG): container finished" podID="1ce9d9d4-356a-488e-8f90-996b73151857" containerID="5d60d241ab449c1a8b10da8ed2dba6c52b825b5d138006e8606758576db4c16b" exitCode=0
Apr 16 15:13:20.260395 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:20.260071 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5c65d4dbd9-h86zq" event={"ID":"1ce9d9d4-356a-488e-8f90-996b73151857","Type":"ContainerDied","Data":"5d60d241ab449c1a8b10da8ed2dba6c52b825b5d138006e8606758576db4c16b"}
Apr 16 15:13:21.266288 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:21.266250 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5c65d4dbd9-h86zq" event={"ID":"1ce9d9d4-356a-488e-8f90-996b73151857","Type":"ContainerStarted","Data":"34bec340e0c278c3c7ebace171a39ae7f7fcdf713f66fed599d5b0764ecc57d3"}
Apr 16 15:13:21.298318 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:21.295194 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5c65d4dbd9-h86zq" podStartSLOduration=6.295173753 podStartE2EDuration="6.295173753s" podCreationTimestamp="2026-04-16 15:13:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:13:21.295034285 +0000 UTC m=+1250.477572060" watchObservedRunningTime="2026-04-16 15:13:21.295173753 +0000 UTC m=+1250.477711528"
Apr 16 15:13:21.458516 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:21.458487 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-5c65d4dbd9-hllt5_0b2c23bf-3ebe-44f9-ba09-6b430d533106/main/0.log"
Apr 16 15:13:21.459005 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:21.458984 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5c65d4dbd9-hllt5"
Apr 16 15:13:21.567493 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:21.567460 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/0b2c23bf-3ebe-44f9-ba09-6b430d533106-dshm\") pod \"0b2c23bf-3ebe-44f9-ba09-6b430d533106\" (UID: \"0b2c23bf-3ebe-44f9-ba09-6b430d533106\") "
Apr 16 15:13:21.567651 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:21.567515 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0b2c23bf-3ebe-44f9-ba09-6b430d533106-tls-certs\") pod \"0b2c23bf-3ebe-44f9-ba09-6b430d533106\" (UID: \"0b2c23bf-3ebe-44f9-ba09-6b430d533106\") "
Apr 16 15:13:21.567651 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:21.567567 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/0b2c23bf-3ebe-44f9-ba09-6b430d533106-home\") pod \"0b2c23bf-3ebe-44f9-ba09-6b430d533106\" (UID: \"0b2c23bf-3ebe-44f9-ba09-6b430d533106\") "
Apr 16 15:13:21.567651 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:21.567644 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/0b2c23bf-3ebe-44f9-ba09-6b430d533106-model-cache\") pod \"0b2c23bf-3ebe-44f9-ba09-6b430d533106\" (UID: \"0b2c23bf-3ebe-44f9-ba09-6b430d533106\") "
Apr 16 15:13:21.567828 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:21.567673 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0b2c23bf-3ebe-44f9-ba09-6b430d533106-kserve-provision-location\") pod \"0b2c23bf-3ebe-44f9-ba09-6b430d533106\" (UID: \"0b2c23bf-3ebe-44f9-ba09-6b430d533106\") "
Apr 16 15:13:21.567828 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:21.567726 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxj6s\" (UniqueName: \"kubernetes.io/projected/0b2c23bf-3ebe-44f9-ba09-6b430d533106-kube-api-access-wxj6s\") pod \"0b2c23bf-3ebe-44f9-ba09-6b430d533106\" (UID: \"0b2c23bf-3ebe-44f9-ba09-6b430d533106\") "
Apr 16 15:13:21.568165 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:21.568140 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b2c23bf-3ebe-44f9-ba09-6b430d533106-model-cache" (OuterVolumeSpecName: "model-cache") pod "0b2c23bf-3ebe-44f9-ba09-6b430d533106" (UID: "0b2c23bf-3ebe-44f9-ba09-6b430d533106"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 15:13:21.568294 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:21.568160 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b2c23bf-3ebe-44f9-ba09-6b430d533106-home" (OuterVolumeSpecName: "home") pod "0b2c23bf-3ebe-44f9-ba09-6b430d533106" (UID: "0b2c23bf-3ebe-44f9-ba09-6b430d533106"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 15:13:21.569757 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:21.569707 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b2c23bf-3ebe-44f9-ba09-6b430d533106-dshm" (OuterVolumeSpecName: "dshm") pod "0b2c23bf-3ebe-44f9-ba09-6b430d533106" (UID: "0b2c23bf-3ebe-44f9-ba09-6b430d533106"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 15:13:21.570129 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:21.570103 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b2c23bf-3ebe-44f9-ba09-6b430d533106-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "0b2c23bf-3ebe-44f9-ba09-6b430d533106" (UID: "0b2c23bf-3ebe-44f9-ba09-6b430d533106"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 15:13:21.570393 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:21.570359 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b2c23bf-3ebe-44f9-ba09-6b430d533106-kube-api-access-wxj6s" (OuterVolumeSpecName: "kube-api-access-wxj6s") pod "0b2c23bf-3ebe-44f9-ba09-6b430d533106" (UID: "0b2c23bf-3ebe-44f9-ba09-6b430d533106"). InnerVolumeSpecName "kube-api-access-wxj6s". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 15:13:21.625263 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:21.625210 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b2c23bf-3ebe-44f9-ba09-6b430d533106-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "0b2c23bf-3ebe-44f9-ba09-6b430d533106" (UID: "0b2c23bf-3ebe-44f9-ba09-6b430d533106"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 15:13:21.669050 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:21.669014 2579 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/0b2c23bf-3ebe-44f9-ba09-6b430d533106-home\") on node \"ip-10-0-139-47.ec2.internal\" DevicePath \"\""
Apr 16 15:13:21.669050 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:21.669042 2579 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/0b2c23bf-3ebe-44f9-ba09-6b430d533106-model-cache\") on node \"ip-10-0-139-47.ec2.internal\" DevicePath \"\""
Apr 16 15:13:21.669050 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:21.669054 2579 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0b2c23bf-3ebe-44f9-ba09-6b430d533106-kserve-provision-location\") on node \"ip-10-0-139-47.ec2.internal\" DevicePath \"\""
Apr 16 15:13:21.669276 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:21.669063 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wxj6s\" (UniqueName: \"kubernetes.io/projected/0b2c23bf-3ebe-44f9-ba09-6b430d533106-kube-api-access-wxj6s\") on node \"ip-10-0-139-47.ec2.internal\" DevicePath \"\""
Apr 16 15:13:21.669276 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:21.669072 2579 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/0b2c23bf-3ebe-44f9-ba09-6b430d533106-dshm\") on node \"ip-10-0-139-47.ec2.internal\" DevicePath \"\""
Apr 16 15:13:21.669276 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:21.669080 2579 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0b2c23bf-3ebe-44f9-ba09-6b430d533106-tls-certs\") on node \"ip-10-0-139-47.ec2.internal\" DevicePath \"\""
Apr 16 15:13:22.147554 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:22.147503 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-c9cd6ff97-qnr8c" podUID="0ba877d1-55ac-4d58-817d-544546b8b982" containerName="main" probeResult="failure" output="Get \"https://10.132.0.30:8000/health\": dial tcp 10.132.0.30:8000: connect: connection refused"
Apr 16 15:13:22.270860 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:22.270822 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-5c65d4dbd9-hllt5_0b2c23bf-3ebe-44f9-ba09-6b430d533106/main/0.log"
Apr 16 15:13:22.271344 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:22.271217 2579 generic.go:358] "Generic (PLEG): container finished" podID="0b2c23bf-3ebe-44f9-ba09-6b430d533106" containerID="607c4d6a8954e794245b09bb791a1ea34e162db6cab3ede4aea106c14079f4a5" exitCode=137
Apr 16 15:13:22.271344 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:22.271294 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5c65d4dbd9-hllt5"
Apr 16 15:13:22.271344 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:22.271294 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5c65d4dbd9-hllt5" event={"ID":"0b2c23bf-3ebe-44f9-ba09-6b430d533106","Type":"ContainerDied","Data":"607c4d6a8954e794245b09bb791a1ea34e162db6cab3ede4aea106c14079f4a5"}
Apr 16 15:13:22.271344 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:22.271336 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5c65d4dbd9-hllt5" event={"ID":"0b2c23bf-3ebe-44f9-ba09-6b430d533106","Type":"ContainerDied","Data":"1d38a295b910aaa04d40a377269adcbf475675ad1d0ddbb0255832e2466c60e7"}
Apr 16 15:13:22.271580 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:22.271352 2579 scope.go:117] "RemoveContainer" containerID="607c4d6a8954e794245b09bb791a1ea34e162db6cab3ede4aea106c14079f4a5"
Apr 16 15:13:22.290958 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:22.290788 2579 scope.go:117] "RemoveContainer" containerID="43548c124712a7e58b0d85fd3a85b2d671e7319d4c207eb7d527d30e363887fd"
Apr 16 15:13:22.295025 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:22.294994 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-5c65d4dbd9-hllt5"]
Apr 16 15:13:22.298373 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:22.298337 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-5c65d4dbd9-hllt5"]
Apr 16 15:13:22.304041 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:22.304020 2579 scope.go:117] "RemoveContainer" containerID="607c4d6a8954e794245b09bb791a1ea34e162db6cab3ede4aea106c14079f4a5"
Apr 16 15:13:22.304413 ip-10-0-139-47 kubenswrapper[2579]: E0416 15:13:22.304388 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"607c4d6a8954e794245b09bb791a1ea34e162db6cab3ede4aea106c14079f4a5\": container with ID starting with 607c4d6a8954e794245b09bb791a1ea34e162db6cab3ede4aea106c14079f4a5 not found: ID does not exist" containerID="607c4d6a8954e794245b09bb791a1ea34e162db6cab3ede4aea106c14079f4a5"
Apr 16 15:13:22.304522 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:22.304423 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"607c4d6a8954e794245b09bb791a1ea34e162db6cab3ede4aea106c14079f4a5"} err="failed to get container status \"607c4d6a8954e794245b09bb791a1ea34e162db6cab3ede4aea106c14079f4a5\": rpc error: code = NotFound desc = could not find container \"607c4d6a8954e794245b09bb791a1ea34e162db6cab3ede4aea106c14079f4a5\": container with ID starting with 607c4d6a8954e794245b09bb791a1ea34e162db6cab3ede4aea106c14079f4a5 not found: ID does not exist"
Apr 16 15:13:22.304522 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:22.304442 2579 scope.go:117] "RemoveContainer" containerID="43548c124712a7e58b0d85fd3a85b2d671e7319d4c207eb7d527d30e363887fd"
Apr 16 15:13:22.304695 ip-10-0-139-47 kubenswrapper[2579]: E0416 15:13:22.304678 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43548c124712a7e58b0d85fd3a85b2d671e7319d4c207eb7d527d30e363887fd\": container with ID starting with 43548c124712a7e58b0d85fd3a85b2d671e7319d4c207eb7d527d30e363887fd not found: ID does not exist" containerID="43548c124712a7e58b0d85fd3a85b2d671e7319d4c207eb7d527d30e363887fd"
Apr 16 15:13:22.304750 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:22.304699 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43548c124712a7e58b0d85fd3a85b2d671e7319d4c207eb7d527d30e363887fd"} err="failed to get container status \"43548c124712a7e58b0d85fd3a85b2d671e7319d4c207eb7d527d30e363887fd\": rpc error: code = NotFound desc = could not find container \"43548c124712a7e58b0d85fd3a85b2d671e7319d4c207eb7d527d30e363887fd\": container with ID starting with 43548c124712a7e58b0d85fd3a85b2d671e7319d4c207eb7d527d30e363887fd not found: ID does not exist"
Apr 16 15:13:23.354892 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:23.354860 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b2c23bf-3ebe-44f9-ba09-6b430d533106" path="/var/lib/kubelet/pods/0b2c23bf-3ebe-44f9-ba09-6b430d533106/volumes"
Apr 16 15:13:25.466032 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:25.465988 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5c65d4dbd9-h86zq"
Apr 16 15:13:25.466032 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:25.466041 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5c65d4dbd9-h86zq"
Apr 16 15:13:25.467933 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:25.467892 2579 prober.go:120] "Probe
failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5c65d4dbd9-h86zq" podUID="1ce9d9d4-356a-488e-8f90-996b73151857" containerName="main" probeResult="failure" output="Get \"https://10.132.0.31:8000/health\": dial tcp 10.132.0.31:8000: connect: connection refused" Apr 16 15:13:32.157499 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:32.157468 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-c9cd6ff97-qnr8c" Apr 16 15:13:32.165595 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:32.165558 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-c9cd6ff97-qnr8c" Apr 16 15:13:35.466383 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:35.466330 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5c65d4dbd9-h86zq" podUID="1ce9d9d4-356a-488e-8f90-996b73151857" containerName="main" probeResult="failure" output="Get \"https://10.132.0.31:8000/health\": dial tcp 10.132.0.31:8000: connect: connection refused" Apr 16 15:13:45.467234 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:45.467194 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5c65d4dbd9-h86zq" podUID="1ce9d9d4-356a-488e-8f90-996b73151857" containerName="main" probeResult="failure" output="Get \"https://10.132.0.31:8000/health\": dial tcp 10.132.0.31:8000: connect: connection refused" Apr 16 15:13:45.881207 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:45.881170 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-c9cd6ff97-qnr8c"] Apr 16 15:13:45.881623 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:45.881550 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-c9cd6ff97-qnr8c" 
podUID="0ba877d1-55ac-4d58-817d-544546b8b982" containerName="main" containerID="cri-o://67bc49252b391a2c3857ad955e81fc78dff75bd9102ebef683bf4841e0ca0e59" gracePeriod=30 Apr 16 15:13:55.467313 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:13:55.467256 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5c65d4dbd9-h86zq" podUID="1ce9d9d4-356a-488e-8f90-996b73151857" containerName="main" probeResult="failure" output="Get \"https://10.132.0.31:8000/health\": dial tcp 10.132.0.31:8000: connect: connection refused" Apr 16 15:14:00.146181 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:14:00.146147 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-685684b6bf-zxm6k"] Apr 16 15:14:00.146589 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:14:00.146453 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0b2c23bf-3ebe-44f9-ba09-6b430d533106" containerName="storage-initializer" Apr 16 15:14:00.146589 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:14:00.146464 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b2c23bf-3ebe-44f9-ba09-6b430d533106" containerName="storage-initializer" Apr 16 15:14:00.146589 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:14:00.146481 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0b2c23bf-3ebe-44f9-ba09-6b430d533106" containerName="main" Apr 16 15:14:00.146589 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:14:00.146487 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b2c23bf-3ebe-44f9-ba09-6b430d533106" containerName="main" Apr 16 15:14:00.146589 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:14:00.146540 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="0b2c23bf-3ebe-44f9-ba09-6b430d533106" containerName="main" Apr 16 15:14:00.149670 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:14:00.149652 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-685684b6bf-zxm6k" Apr 16 15:14:00.152067 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:14:00.152037 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-test-kserve-self-signed-certs\"" Apr 16 15:14:00.159546 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:14:00.159521 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-685684b6bf-zxm6k"] Apr 16 15:14:00.206527 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:14:00.206483 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5fab0ec6-79f0-4f5b-93e4-761651092e95-kserve-provision-location\") pod \"router-with-refs-test-kserve-685684b6bf-zxm6k\" (UID: \"5fab0ec6-79f0-4f5b-93e4-761651092e95\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-685684b6bf-zxm6k" Apr 16 15:14:00.206527 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:14:00.206525 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfvn7\" (UniqueName: \"kubernetes.io/projected/5fab0ec6-79f0-4f5b-93e4-761651092e95-kube-api-access-sfvn7\") pod \"router-with-refs-test-kserve-685684b6bf-zxm6k\" (UID: \"5fab0ec6-79f0-4f5b-93e4-761651092e95\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-685684b6bf-zxm6k" Apr 16 15:14:00.206754 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:14:00.206563 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5fab0ec6-79f0-4f5b-93e4-761651092e95-dshm\") pod \"router-with-refs-test-kserve-685684b6bf-zxm6k\" (UID: \"5fab0ec6-79f0-4f5b-93e4-761651092e95\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-685684b6bf-zxm6k" Apr 16 15:14:00.206754 ip-10-0-139-47 
kubenswrapper[2579]: I0416 15:14:00.206602 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5fab0ec6-79f0-4f5b-93e4-761651092e95-tls-certs\") pod \"router-with-refs-test-kserve-685684b6bf-zxm6k\" (UID: \"5fab0ec6-79f0-4f5b-93e4-761651092e95\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-685684b6bf-zxm6k" Apr 16 15:14:00.206754 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:14:00.206627 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5fab0ec6-79f0-4f5b-93e4-761651092e95-home\") pod \"router-with-refs-test-kserve-685684b6bf-zxm6k\" (UID: \"5fab0ec6-79f0-4f5b-93e4-761651092e95\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-685684b6bf-zxm6k" Apr 16 15:14:00.206754 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:14:00.206661 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5fab0ec6-79f0-4f5b-93e4-761651092e95-model-cache\") pod \"router-with-refs-test-kserve-685684b6bf-zxm6k\" (UID: \"5fab0ec6-79f0-4f5b-93e4-761651092e95\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-685684b6bf-zxm6k" Apr 16 15:14:00.307917 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:14:00.307866 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5fab0ec6-79f0-4f5b-93e4-761651092e95-dshm\") pod \"router-with-refs-test-kserve-685684b6bf-zxm6k\" (UID: \"5fab0ec6-79f0-4f5b-93e4-761651092e95\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-685684b6bf-zxm6k" Apr 16 15:14:00.308107 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:14:00.307941 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/5fab0ec6-79f0-4f5b-93e4-761651092e95-tls-certs\") pod \"router-with-refs-test-kserve-685684b6bf-zxm6k\" (UID: \"5fab0ec6-79f0-4f5b-93e4-761651092e95\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-685684b6bf-zxm6k" Apr 16 15:14:00.308107 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:14:00.307976 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5fab0ec6-79f0-4f5b-93e4-761651092e95-home\") pod \"router-with-refs-test-kserve-685684b6bf-zxm6k\" (UID: \"5fab0ec6-79f0-4f5b-93e4-761651092e95\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-685684b6bf-zxm6k" Apr 16 15:14:00.308107 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:14:00.307993 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5fab0ec6-79f0-4f5b-93e4-761651092e95-model-cache\") pod \"router-with-refs-test-kserve-685684b6bf-zxm6k\" (UID: \"5fab0ec6-79f0-4f5b-93e4-761651092e95\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-685684b6bf-zxm6k" Apr 16 15:14:00.308107 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:14:00.308053 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5fab0ec6-79f0-4f5b-93e4-761651092e95-kserve-provision-location\") pod \"router-with-refs-test-kserve-685684b6bf-zxm6k\" (UID: \"5fab0ec6-79f0-4f5b-93e4-761651092e95\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-685684b6bf-zxm6k" Apr 16 15:14:00.308107 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:14:00.308081 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sfvn7\" (UniqueName: \"kubernetes.io/projected/5fab0ec6-79f0-4f5b-93e4-761651092e95-kube-api-access-sfvn7\") pod \"router-with-refs-test-kserve-685684b6bf-zxm6k\" (UID: \"5fab0ec6-79f0-4f5b-93e4-761651092e95\") " 
pod="kserve-ci-e2e-test/router-with-refs-test-kserve-685684b6bf-zxm6k" Apr 16 15:14:00.308447 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:14:00.308421 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5fab0ec6-79f0-4f5b-93e4-761651092e95-home\") pod \"router-with-refs-test-kserve-685684b6bf-zxm6k\" (UID: \"5fab0ec6-79f0-4f5b-93e4-761651092e95\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-685684b6bf-zxm6k" Apr 16 15:14:00.308696 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:14:00.308667 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5fab0ec6-79f0-4f5b-93e4-761651092e95-model-cache\") pod \"router-with-refs-test-kserve-685684b6bf-zxm6k\" (UID: \"5fab0ec6-79f0-4f5b-93e4-761651092e95\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-685684b6bf-zxm6k" Apr 16 15:14:00.308889 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:14:00.308733 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5fab0ec6-79f0-4f5b-93e4-761651092e95-kserve-provision-location\") pod \"router-with-refs-test-kserve-685684b6bf-zxm6k\" (UID: \"5fab0ec6-79f0-4f5b-93e4-761651092e95\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-685684b6bf-zxm6k" Apr 16 15:14:00.310802 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:14:00.310780 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5fab0ec6-79f0-4f5b-93e4-761651092e95-tls-certs\") pod \"router-with-refs-test-kserve-685684b6bf-zxm6k\" (UID: \"5fab0ec6-79f0-4f5b-93e4-761651092e95\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-685684b6bf-zxm6k" Apr 16 15:14:00.311422 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:14:00.311387 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: 
\"kubernetes.io/empty-dir/5fab0ec6-79f0-4f5b-93e4-761651092e95-dshm\") pod \"router-with-refs-test-kserve-685684b6bf-zxm6k\" (UID: \"5fab0ec6-79f0-4f5b-93e4-761651092e95\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-685684b6bf-zxm6k" Apr 16 15:14:00.315716 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:14:00.315681 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfvn7\" (UniqueName: \"kubernetes.io/projected/5fab0ec6-79f0-4f5b-93e4-761651092e95-kube-api-access-sfvn7\") pod \"router-with-refs-test-kserve-685684b6bf-zxm6k\" (UID: \"5fab0ec6-79f0-4f5b-93e4-761651092e95\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-685684b6bf-zxm6k" Apr 16 15:14:00.461273 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:14:00.461182 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-685684b6bf-zxm6k" Apr 16 15:14:00.620251 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:14:00.620217 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-685684b6bf-zxm6k"] Apr 16 15:14:00.624527 ip-10-0-139-47 kubenswrapper[2579]: W0416 15:14:00.624498 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fab0ec6_79f0_4f5b_93e4_761651092e95.slice/crio-aba58c966064fae0780fd225c43d15a0799a1f87e71d586968c2b0aef51b7385 WatchSource:0}: Error finding container aba58c966064fae0780fd225c43d15a0799a1f87e71d586968c2b0aef51b7385: Status 404 returned error can't find the container with id aba58c966064fae0780fd225c43d15a0799a1f87e71d586968c2b0aef51b7385 Apr 16 15:14:01.411559 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:14:01.411516 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-685684b6bf-zxm6k" 
event={"ID":"5fab0ec6-79f0-4f5b-93e4-761651092e95","Type":"ContainerStarted","Data":"d6170545465409ce014879498e6f304f8dfedf87abd0ef83dfe38bf334a23236"} Apr 16 15:14:01.412008 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:14:01.411567 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-685684b6bf-zxm6k" event={"ID":"5fab0ec6-79f0-4f5b-93e4-761651092e95","Type":"ContainerStarted","Data":"aba58c966064fae0780fd225c43d15a0799a1f87e71d586968c2b0aef51b7385"} Apr 16 15:14:05.425332 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:14:05.425298 2579 generic.go:358] "Generic (PLEG): container finished" podID="5fab0ec6-79f0-4f5b-93e4-761651092e95" containerID="d6170545465409ce014879498e6f304f8dfedf87abd0ef83dfe38bf334a23236" exitCode=0 Apr 16 15:14:05.425332 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:14:05.425338 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-685684b6bf-zxm6k" event={"ID":"5fab0ec6-79f0-4f5b-93e4-761651092e95","Type":"ContainerDied","Data":"d6170545465409ce014879498e6f304f8dfedf87abd0ef83dfe38bf334a23236"} Apr 16 15:14:05.466950 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:14:05.466894 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5c65d4dbd9-h86zq" podUID="1ce9d9d4-356a-488e-8f90-996b73151857" containerName="main" probeResult="failure" output="Get \"https://10.132.0.31:8000/health\": dial tcp 10.132.0.31:8000: connect: connection refused" Apr 16 15:14:06.433943 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:14:06.433883 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-685684b6bf-zxm6k" event={"ID":"5fab0ec6-79f0-4f5b-93e4-761651092e95","Type":"ContainerStarted","Data":"b9d2e9888e1538b47583588620eb059c00ca4c3ac61ba53d49c3b3b6cd26830c"} Apr 16 15:14:06.454205 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:14:06.454151 2579 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-685684b6bf-zxm6k" podStartSLOduration=6.454132795 podStartE2EDuration="6.454132795s" podCreationTimestamp="2026-04-16 15:14:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:14:06.451718018 +0000 UTC m=+1295.634255792" watchObservedRunningTime="2026-04-16 15:14:06.454132795 +0000 UTC m=+1295.636670570" Apr 16 15:14:10.461701 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:14:10.461654 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-685684b6bf-zxm6k" Apr 16 15:14:10.461701 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:14:10.461712 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-685684b6bf-zxm6k" Apr 16 15:14:10.463473 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:14:10.463438 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-685684b6bf-zxm6k" podUID="5fab0ec6-79f0-4f5b-93e4-761651092e95" containerName="main" probeResult="failure" output="Get \"https://10.132.0.32:8000/health\": dial tcp 10.132.0.32:8000: connect: connection refused" Apr 16 15:14:15.466680 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:14:15.466626 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5c65d4dbd9-h86zq" podUID="1ce9d9d4-356a-488e-8f90-996b73151857" containerName="main" probeResult="failure" output="Get \"https://10.132.0.31:8000/health\": dial tcp 10.132.0.31:8000: connect: connection refused" Apr 16 15:14:16.226821 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:14:16.226793 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-test-kserve-c9cd6ff97-qnr8c_0ba877d1-55ac-4d58-817d-544546b8b982/main/0.log" Apr 16 15:14:16.227237 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:14:16.227215 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-c9cd6ff97-qnr8c" Apr 16 15:14:16.355089 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:14:16.355045 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0ba877d1-55ac-4d58-817d-544546b8b982-kserve-provision-location\") pod \"0ba877d1-55ac-4d58-817d-544546b8b982\" (UID: \"0ba877d1-55ac-4d58-817d-544546b8b982\") " Apr 16 15:14:16.355290 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:14:16.355101 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/0ba877d1-55ac-4d58-817d-544546b8b982-home\") pod \"0ba877d1-55ac-4d58-817d-544546b8b982\" (UID: \"0ba877d1-55ac-4d58-817d-544546b8b982\") " Apr 16 15:14:16.355290 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:14:16.355125 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/0ba877d1-55ac-4d58-817d-544546b8b982-dshm\") pod \"0ba877d1-55ac-4d58-817d-544546b8b982\" (UID: \"0ba877d1-55ac-4d58-817d-544546b8b982\") " Apr 16 15:14:16.355290 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:14:16.355149 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0ba877d1-55ac-4d58-817d-544546b8b982-tls-certs\") pod \"0ba877d1-55ac-4d58-817d-544546b8b982\" (UID: \"0ba877d1-55ac-4d58-817d-544546b8b982\") " Apr 16 15:14:16.355290 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:14:16.355269 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-kwk7t\" (UniqueName: \"kubernetes.io/projected/0ba877d1-55ac-4d58-817d-544546b8b982-kube-api-access-kwk7t\") pod \"0ba877d1-55ac-4d58-817d-544546b8b982\" (UID: \"0ba877d1-55ac-4d58-817d-544546b8b982\") " Apr 16 15:14:16.355478 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:14:16.355295 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/0ba877d1-55ac-4d58-817d-544546b8b982-model-cache\") pod \"0ba877d1-55ac-4d58-817d-544546b8b982\" (UID: \"0ba877d1-55ac-4d58-817d-544546b8b982\") " Apr 16 15:14:16.355832 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:14:16.355536 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ba877d1-55ac-4d58-817d-544546b8b982-home" (OuterVolumeSpecName: "home") pod "0ba877d1-55ac-4d58-817d-544546b8b982" (UID: "0ba877d1-55ac-4d58-817d-544546b8b982"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:14:16.355832 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:14:16.355698 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ba877d1-55ac-4d58-817d-544546b8b982-model-cache" (OuterVolumeSpecName: "model-cache") pod "0ba877d1-55ac-4d58-817d-544546b8b982" (UID: "0ba877d1-55ac-4d58-817d-544546b8b982"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:14:16.357545 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:14:16.357513 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ba877d1-55ac-4d58-817d-544546b8b982-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "0ba877d1-55ac-4d58-817d-544546b8b982" (UID: "0ba877d1-55ac-4d58-817d-544546b8b982"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 15:14:16.357758 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:14:16.357731 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ba877d1-55ac-4d58-817d-544546b8b982-dshm" (OuterVolumeSpecName: "dshm") pod "0ba877d1-55ac-4d58-817d-544546b8b982" (UID: "0ba877d1-55ac-4d58-817d-544546b8b982"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:14:16.358317 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:14:16.358288 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ba877d1-55ac-4d58-817d-544546b8b982-kube-api-access-kwk7t" (OuterVolumeSpecName: "kube-api-access-kwk7t") pod "0ba877d1-55ac-4d58-817d-544546b8b982" (UID: "0ba877d1-55ac-4d58-817d-544546b8b982"). InnerVolumeSpecName "kube-api-access-kwk7t". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 15:14:16.410714 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:14:16.410670 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ba877d1-55ac-4d58-817d-544546b8b982-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "0ba877d1-55ac-4d58-817d-544546b8b982" (UID: "0ba877d1-55ac-4d58-817d-544546b8b982"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:14:16.456970 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:14:16.456847 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kwk7t\" (UniqueName: \"kubernetes.io/projected/0ba877d1-55ac-4d58-817d-544546b8b982-kube-api-access-kwk7t\") on node \"ip-10-0-139-47.ec2.internal\" DevicePath \"\"" Apr 16 15:14:16.456970 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:14:16.456880 2579 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/0ba877d1-55ac-4d58-817d-544546b8b982-model-cache\") on node \"ip-10-0-139-47.ec2.internal\" DevicePath \"\"" Apr 16 15:14:16.456970 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:14:16.456890 2579 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0ba877d1-55ac-4d58-817d-544546b8b982-kserve-provision-location\") on node \"ip-10-0-139-47.ec2.internal\" DevicePath \"\"" Apr 16 15:14:16.456970 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:14:16.456918 2579 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/0ba877d1-55ac-4d58-817d-544546b8b982-home\") on node \"ip-10-0-139-47.ec2.internal\" DevicePath \"\"" Apr 16 15:14:16.456970 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:14:16.456928 2579 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/0ba877d1-55ac-4d58-817d-544546b8b982-dshm\") on node \"ip-10-0-139-47.ec2.internal\" DevicePath \"\"" Apr 16 15:14:16.456970 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:14:16.456936 2579 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0ba877d1-55ac-4d58-817d-544546b8b982-tls-certs\") on node \"ip-10-0-139-47.ec2.internal\" DevicePath \"\"" Apr 16 15:14:16.472786 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:14:16.472753 2579 log.go:25] "Finished 
parsing log file" path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-test-kserve-c9cd6ff97-qnr8c_0ba877d1-55ac-4d58-817d-544546b8b982/main/0.log" Apr 16 15:14:16.473286 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:14:16.473156 2579 generic.go:358] "Generic (PLEG): container finished" podID="0ba877d1-55ac-4d58-817d-544546b8b982" containerID="67bc49252b391a2c3857ad955e81fc78dff75bd9102ebef683bf4841e0ca0e59" exitCode=137 Apr 16 15:14:16.473286 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:14:16.473215 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-c9cd6ff97-qnr8c" event={"ID":"0ba877d1-55ac-4d58-817d-544546b8b982","Type":"ContainerDied","Data":"67bc49252b391a2c3857ad955e81fc78dff75bd9102ebef683bf4841e0ca0e59"} Apr 16 15:14:16.473286 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:14:16.473245 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-c9cd6ff97-qnr8c" event={"ID":"0ba877d1-55ac-4d58-817d-544546b8b982","Type":"ContainerDied","Data":"6cd175df943021671c8ba0472118e4af902860b89bb0b02882fcf992e19225d7"} Apr 16 15:14:16.473286 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:14:16.473261 2579 scope.go:117] "RemoveContainer" containerID="67bc49252b391a2c3857ad955e81fc78dff75bd9102ebef683bf4841e0ca0e59" Apr 16 15:14:16.473286 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:14:16.473265 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-c9cd6ff97-qnr8c" Apr 16 15:14:16.493459 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:14:16.493433 2579 scope.go:117] "RemoveContainer" containerID="6134282e9a7448b7a350769aa51d53614ba7e838dad832b956237751751b0101" Apr 16 15:14:16.497885 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:14:16.497823 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-c9cd6ff97-qnr8c"] Apr 16 15:14:16.499922 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:14:16.499883 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-c9cd6ff97-qnr8c"] Apr 16 15:14:16.557130 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:14:16.557103 2579 scope.go:117] "RemoveContainer" containerID="67bc49252b391a2c3857ad955e81fc78dff75bd9102ebef683bf4841e0ca0e59" Apr 16 15:14:16.557526 ip-10-0-139-47 kubenswrapper[2579]: E0416 15:14:16.557507 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67bc49252b391a2c3857ad955e81fc78dff75bd9102ebef683bf4841e0ca0e59\": container with ID starting with 67bc49252b391a2c3857ad955e81fc78dff75bd9102ebef683bf4841e0ca0e59 not found: ID does not exist" containerID="67bc49252b391a2c3857ad955e81fc78dff75bd9102ebef683bf4841e0ca0e59" Apr 16 15:14:16.557614 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:14:16.557539 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67bc49252b391a2c3857ad955e81fc78dff75bd9102ebef683bf4841e0ca0e59"} err="failed to get container status \"67bc49252b391a2c3857ad955e81fc78dff75bd9102ebef683bf4841e0ca0e59\": rpc error: code = NotFound desc = could not find container \"67bc49252b391a2c3857ad955e81fc78dff75bd9102ebef683bf4841e0ca0e59\": container with ID starting with 67bc49252b391a2c3857ad955e81fc78dff75bd9102ebef683bf4841e0ca0e59 not found: ID does 
not exist" Apr 16 15:14:16.557614 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:14:16.557560 2579 scope.go:117] "RemoveContainer" containerID="6134282e9a7448b7a350769aa51d53614ba7e838dad832b956237751751b0101" Apr 16 15:14:16.557844 ip-10-0-139-47 kubenswrapper[2579]: E0416 15:14:16.557824 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6134282e9a7448b7a350769aa51d53614ba7e838dad832b956237751751b0101\": container with ID starting with 6134282e9a7448b7a350769aa51d53614ba7e838dad832b956237751751b0101 not found: ID does not exist" containerID="6134282e9a7448b7a350769aa51d53614ba7e838dad832b956237751751b0101" Apr 16 15:14:16.557927 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:14:16.557850 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6134282e9a7448b7a350769aa51d53614ba7e838dad832b956237751751b0101"} err="failed to get container status \"6134282e9a7448b7a350769aa51d53614ba7e838dad832b956237751751b0101\": rpc error: code = NotFound desc = could not find container \"6134282e9a7448b7a350769aa51d53614ba7e838dad832b956237751751b0101\": container with ID starting with 6134282e9a7448b7a350769aa51d53614ba7e838dad832b956237751751b0101 not found: ID does not exist" Apr 16 15:14:17.356112 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:14:17.356080 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ba877d1-55ac-4d58-817d-544546b8b982" path="/var/lib/kubelet/pods/0ba877d1-55ac-4d58-817d-544546b8b982/volumes" Apr 16 15:14:20.462301 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:14:20.462241 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-685684b6bf-zxm6k" podUID="5fab0ec6-79f0-4f5b-93e4-761651092e95" containerName="main" probeResult="failure" output="Get \"https://10.132.0.32:8000/health\": dial tcp 10.132.0.32:8000: connect: connection refused" Apr 16 15:14:25.466358 
ip-10-0-139-47 kubenswrapper[2579]: I0416 15:14:25.466312 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5c65d4dbd9-h86zq" podUID="1ce9d9d4-356a-488e-8f90-996b73151857" containerName="main" probeResult="failure" output="Get \"https://10.132.0.31:8000/health\": dial tcp 10.132.0.31:8000: connect: connection refused" Apr 16 15:14:30.462048 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:14:30.461992 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-685684b6bf-zxm6k" podUID="5fab0ec6-79f0-4f5b-93e4-761651092e95" containerName="main" probeResult="failure" output="Get \"https://10.132.0.32:8000/health\": dial tcp 10.132.0.32:8000: connect: connection refused" Apr 16 15:14:35.466731 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:14:35.466681 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5c65d4dbd9-h86zq" podUID="1ce9d9d4-356a-488e-8f90-996b73151857" containerName="main" probeResult="failure" output="Get \"https://10.132.0.31:8000/health\": dial tcp 10.132.0.31:8000: connect: connection refused" Apr 16 15:14:40.462129 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:14:40.462081 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-685684b6bf-zxm6k" podUID="5fab0ec6-79f0-4f5b-93e4-761651092e95" containerName="main" probeResult="failure" output="Get \"https://10.132.0.32:8000/health\": dial tcp 10.132.0.32:8000: connect: connection refused" Apr 16 15:14:45.466566 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:14:45.466519 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5c65d4dbd9-h86zq" podUID="1ce9d9d4-356a-488e-8f90-996b73151857" containerName="main" probeResult="failure" output="Get \"https://10.132.0.31:8000/health\": dial tcp 10.132.0.31:8000: connect: connection refused" Apr 16 15:14:50.462013 
ip-10-0-139-47 kubenswrapper[2579]: I0416 15:14:50.461883 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-685684b6bf-zxm6k" podUID="5fab0ec6-79f0-4f5b-93e4-761651092e95" containerName="main" probeResult="failure" output="Get \"https://10.132.0.32:8000/health\": dial tcp 10.132.0.32:8000: connect: connection refused" Apr 16 15:14:55.467242 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:14:55.467196 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5c65d4dbd9-h86zq" podUID="1ce9d9d4-356a-488e-8f90-996b73151857" containerName="main" probeResult="failure" output="Get \"https://10.132.0.31:8000/health\": dial tcp 10.132.0.31:8000: connect: connection refused" Apr 16 15:15:00.462005 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:15:00.461948 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-685684b6bf-zxm6k" podUID="5fab0ec6-79f0-4f5b-93e4-761651092e95" containerName="main" probeResult="failure" output="Get \"https://10.132.0.32:8000/health\": dial tcp 10.132.0.32:8000: connect: connection refused" Apr 16 15:15:05.477333 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:15:05.477302 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5c65d4dbd9-h86zq" Apr 16 15:15:05.485497 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:15:05.485457 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5c65d4dbd9-h86zq" Apr 16 15:15:06.851632 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:15:06.851596 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-5c65d4dbd9-h86zq"] Apr 16 15:15:06.852060 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:15:06.851947 2579 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/stop-feature-test-kserve-5c65d4dbd9-h86zq" podUID="1ce9d9d4-356a-488e-8f90-996b73151857" containerName="main" containerID="cri-o://34bec340e0c278c3c7ebace171a39ae7f7fcdf713f66fed599d5b0764ecc57d3" gracePeriod=30 Apr 16 15:15:10.462486 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:15:10.462442 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-685684b6bf-zxm6k" podUID="5fab0ec6-79f0-4f5b-93e4-761651092e95" containerName="main" probeResult="failure" output="Get \"https://10.132.0.32:8000/health\": dial tcp 10.132.0.32:8000: connect: connection refused" Apr 16 15:15:20.462170 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:15:20.462123 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-685684b6bf-zxm6k" podUID="5fab0ec6-79f0-4f5b-93e4-761651092e95" containerName="main" probeResult="failure" output="Get \"https://10.132.0.32:8000/health\": dial tcp 10.132.0.32:8000: connect: connection refused" Apr 16 15:15:30.462687 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:15:30.462632 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-685684b6bf-zxm6k" podUID="5fab0ec6-79f0-4f5b-93e4-761651092e95" containerName="main" probeResult="failure" output="Get \"https://10.132.0.32:8000/health\": dial tcp 10.132.0.32:8000: connect: connection refused" Apr 16 15:15:37.133418 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:15:37.133388 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-5c65d4dbd9-h86zq_1ce9d9d4-356a-488e-8f90-996b73151857/main/0.log" Apr 16 15:15:37.134274 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:15:37.134254 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5c65d4dbd9-h86zq" Apr 16 15:15:37.240969 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:15:37.240915 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1ce9d9d4-356a-488e-8f90-996b73151857-home\") pod \"1ce9d9d4-356a-488e-8f90-996b73151857\" (UID: \"1ce9d9d4-356a-488e-8f90-996b73151857\") " Apr 16 15:15:37.240969 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:15:37.240976 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1ce9d9d4-356a-488e-8f90-996b73151857-dshm\") pod \"1ce9d9d4-356a-488e-8f90-996b73151857\" (UID: \"1ce9d9d4-356a-488e-8f90-996b73151857\") " Apr 16 15:15:37.241219 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:15:37.241026 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1ce9d9d4-356a-488e-8f90-996b73151857-tls-certs\") pod \"1ce9d9d4-356a-488e-8f90-996b73151857\" (UID: \"1ce9d9d4-356a-488e-8f90-996b73151857\") " Apr 16 15:15:37.241219 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:15:37.241064 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1ce9d9d4-356a-488e-8f90-996b73151857-kserve-provision-location\") pod \"1ce9d9d4-356a-488e-8f90-996b73151857\" (UID: \"1ce9d9d4-356a-488e-8f90-996b73151857\") " Apr 16 15:15:37.241340 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:15:37.241289 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7ckw7\" (UniqueName: \"kubernetes.io/projected/1ce9d9d4-356a-488e-8f90-996b73151857-kube-api-access-7ckw7\") pod \"1ce9d9d4-356a-488e-8f90-996b73151857\" (UID: \"1ce9d9d4-356a-488e-8f90-996b73151857\") " Apr 16 15:15:37.241404 ip-10-0-139-47 kubenswrapper[2579]: I0416 
15:15:37.241371 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1ce9d9d4-356a-488e-8f90-996b73151857-model-cache\") pod \"1ce9d9d4-356a-488e-8f90-996b73151857\" (UID: \"1ce9d9d4-356a-488e-8f90-996b73151857\") " Apr 16 15:15:37.241540 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:15:37.241363 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ce9d9d4-356a-488e-8f90-996b73151857-home" (OuterVolumeSpecName: "home") pod "1ce9d9d4-356a-488e-8f90-996b73151857" (UID: "1ce9d9d4-356a-488e-8f90-996b73151857"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:15:37.241678 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:15:37.241639 2579 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1ce9d9d4-356a-488e-8f90-996b73151857-home\") on node \"ip-10-0-139-47.ec2.internal\" DevicePath \"\"" Apr 16 15:15:37.242005 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:15:37.241951 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ce9d9d4-356a-488e-8f90-996b73151857-model-cache" (OuterVolumeSpecName: "model-cache") pod "1ce9d9d4-356a-488e-8f90-996b73151857" (UID: "1ce9d9d4-356a-488e-8f90-996b73151857"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:15:37.244570 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:15:37.244298 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ce9d9d4-356a-488e-8f90-996b73151857-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "1ce9d9d4-356a-488e-8f90-996b73151857" (UID: "1ce9d9d4-356a-488e-8f90-996b73151857"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 15:15:37.245217 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:15:37.245168 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ce9d9d4-356a-488e-8f90-996b73151857-dshm" (OuterVolumeSpecName: "dshm") pod "1ce9d9d4-356a-488e-8f90-996b73151857" (UID: "1ce9d9d4-356a-488e-8f90-996b73151857"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:15:37.245217 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:15:37.245183 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ce9d9d4-356a-488e-8f90-996b73151857-kube-api-access-7ckw7" (OuterVolumeSpecName: "kube-api-access-7ckw7") pod "1ce9d9d4-356a-488e-8f90-996b73151857" (UID: "1ce9d9d4-356a-488e-8f90-996b73151857"). InnerVolumeSpecName "kube-api-access-7ckw7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 15:15:37.307112 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:15:37.307054 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ce9d9d4-356a-488e-8f90-996b73151857-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "1ce9d9d4-356a-488e-8f90-996b73151857" (UID: "1ce9d9d4-356a-488e-8f90-996b73151857"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:15:37.342536 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:15:37.342495 2579 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1ce9d9d4-356a-488e-8f90-996b73151857-model-cache\") on node \"ip-10-0-139-47.ec2.internal\" DevicePath \"\"" Apr 16 15:15:37.342536 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:15:37.342534 2579 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1ce9d9d4-356a-488e-8f90-996b73151857-dshm\") on node \"ip-10-0-139-47.ec2.internal\" DevicePath \"\"" Apr 16 15:15:37.342791 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:15:37.342551 2579 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1ce9d9d4-356a-488e-8f90-996b73151857-tls-certs\") on node \"ip-10-0-139-47.ec2.internal\" DevicePath \"\"" Apr 16 15:15:37.342791 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:15:37.342567 2579 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1ce9d9d4-356a-488e-8f90-996b73151857-kserve-provision-location\") on node \"ip-10-0-139-47.ec2.internal\" DevicePath \"\"" Apr 16 15:15:37.342791 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:15:37.342585 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7ckw7\" (UniqueName: \"kubernetes.io/projected/1ce9d9d4-356a-488e-8f90-996b73151857-kube-api-access-7ckw7\") on node \"ip-10-0-139-47.ec2.internal\" DevicePath \"\"" Apr 16 15:15:37.745776 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:15:37.745684 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-5c65d4dbd9-h86zq_1ce9d9d4-356a-488e-8f90-996b73151857/main/0.log" Apr 16 15:15:37.746370 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:15:37.746339 2579 generic.go:358] "Generic (PLEG): container finished" 
podID="1ce9d9d4-356a-488e-8f90-996b73151857" containerID="34bec340e0c278c3c7ebace171a39ae7f7fcdf713f66fed599d5b0764ecc57d3" exitCode=137 Apr 16 15:15:37.746494 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:15:37.746453 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5c65d4dbd9-h86zq" event={"ID":"1ce9d9d4-356a-488e-8f90-996b73151857","Type":"ContainerDied","Data":"34bec340e0c278c3c7ebace171a39ae7f7fcdf713f66fed599d5b0764ecc57d3"} Apr 16 15:15:37.746559 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:15:37.746495 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5c65d4dbd9-h86zq" event={"ID":"1ce9d9d4-356a-488e-8f90-996b73151857","Type":"ContainerDied","Data":"06e9651d87e9a28dd81d8bd0cc7c1d08359dca77134c8069491897a30d9ba632"} Apr 16 15:15:37.746559 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:15:37.746516 2579 scope.go:117] "RemoveContainer" containerID="34bec340e0c278c3c7ebace171a39ae7f7fcdf713f66fed599d5b0764ecc57d3" Apr 16 15:15:37.746672 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:15:37.746515 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5c65d4dbd9-h86zq" Apr 16 15:15:37.765872 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:15:37.765831 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-5c65d4dbd9-h86zq"] Apr 16 15:15:37.771966 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:15:37.771943 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-5c65d4dbd9-h86zq"] Apr 16 15:15:37.774008 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:15:37.773981 2579 scope.go:117] "RemoveContainer" containerID="5d60d241ab449c1a8b10da8ed2dba6c52b825b5d138006e8606758576db4c16b" Apr 16 15:15:37.850980 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:15:37.850948 2579 scope.go:117] "RemoveContainer" containerID="34bec340e0c278c3c7ebace171a39ae7f7fcdf713f66fed599d5b0764ecc57d3" Apr 16 15:15:37.851407 ip-10-0-139-47 kubenswrapper[2579]: E0416 15:15:37.851377 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34bec340e0c278c3c7ebace171a39ae7f7fcdf713f66fed599d5b0764ecc57d3\": container with ID starting with 34bec340e0c278c3c7ebace171a39ae7f7fcdf713f66fed599d5b0764ecc57d3 not found: ID does not exist" containerID="34bec340e0c278c3c7ebace171a39ae7f7fcdf713f66fed599d5b0764ecc57d3" Apr 16 15:15:37.851540 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:15:37.851414 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34bec340e0c278c3c7ebace171a39ae7f7fcdf713f66fed599d5b0764ecc57d3"} err="failed to get container status \"34bec340e0c278c3c7ebace171a39ae7f7fcdf713f66fed599d5b0764ecc57d3\": rpc error: code = NotFound desc = could not find container \"34bec340e0c278c3c7ebace171a39ae7f7fcdf713f66fed599d5b0764ecc57d3\": container with ID starting with 34bec340e0c278c3c7ebace171a39ae7f7fcdf713f66fed599d5b0764ecc57d3 not found: ID does not exist" Apr 16 
15:15:37.851540 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:15:37.851441 2579 scope.go:117] "RemoveContainer" containerID="5d60d241ab449c1a8b10da8ed2dba6c52b825b5d138006e8606758576db4c16b" Apr 16 15:15:37.851769 ip-10-0-139-47 kubenswrapper[2579]: E0416 15:15:37.851748 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d60d241ab449c1a8b10da8ed2dba6c52b825b5d138006e8606758576db4c16b\": container with ID starting with 5d60d241ab449c1a8b10da8ed2dba6c52b825b5d138006e8606758576db4c16b not found: ID does not exist" containerID="5d60d241ab449c1a8b10da8ed2dba6c52b825b5d138006e8606758576db4c16b" Apr 16 15:15:37.851847 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:15:37.851778 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d60d241ab449c1a8b10da8ed2dba6c52b825b5d138006e8606758576db4c16b"} err="failed to get container status \"5d60d241ab449c1a8b10da8ed2dba6c52b825b5d138006e8606758576db4c16b\": rpc error: code = NotFound desc = could not find container \"5d60d241ab449c1a8b10da8ed2dba6c52b825b5d138006e8606758576db4c16b\": container with ID starting with 5d60d241ab449c1a8b10da8ed2dba6c52b825b5d138006e8606758576db4c16b not found: ID does not exist" Apr 16 15:15:39.354433 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:15:39.354392 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ce9d9d4-356a-488e-8f90-996b73151857" path="/var/lib/kubelet/pods/1ce9d9d4-356a-488e-8f90-996b73151857/volumes" Apr 16 15:15:40.471818 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:15:40.471789 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-685684b6bf-zxm6k" Apr 16 15:15:40.482648 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:15:40.482614 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/router-with-refs-test-kserve-685684b6bf-zxm6k" Apr 16 15:16:09.674270 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:09.674234 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-797f447c889vskf"] Apr 16 15:16:09.674684 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:09.674514 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0ba877d1-55ac-4d58-817d-544546b8b982" containerName="storage-initializer" Apr 16 15:16:09.674684 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:09.674525 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ba877d1-55ac-4d58-817d-544546b8b982" containerName="storage-initializer" Apr 16 15:16:09.674684 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:09.674537 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1ce9d9d4-356a-488e-8f90-996b73151857" containerName="storage-initializer" Apr 16 15:16:09.674684 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:09.674542 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ce9d9d4-356a-488e-8f90-996b73151857" containerName="storage-initializer" Apr 16 15:16:09.674684 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:09.674551 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1ce9d9d4-356a-488e-8f90-996b73151857" containerName="main" Apr 16 15:16:09.674684 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:09.674556 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ce9d9d4-356a-488e-8f90-996b73151857" containerName="main" Apr 16 15:16:09.674684 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:09.674563 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0ba877d1-55ac-4d58-817d-544546b8b982" containerName="main" Apr 16 15:16:09.674684 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:09.674568 2579 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="0ba877d1-55ac-4d58-817d-544546b8b982" containerName="main" Apr 16 15:16:09.674684 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:09.674619 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="0ba877d1-55ac-4d58-817d-544546b8b982" containerName="main" Apr 16 15:16:09.674684 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:09.674627 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="1ce9d9d4-356a-488e-8f90-996b73151857" containerName="main" Apr 16 15:16:09.676863 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:09.676847 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-797f447c889vskf" Apr 16 15:16:09.679284 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:09.679259 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv8f1a6f044e8c7a4d31a250e0c4861caf-kserve-self-signed-certs\"" Apr 16 15:16:09.679401 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:09.679369 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-dockercfg-xvj8f\"" Apr 16 15:16:09.690921 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:09.689266 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-797f447c889vskf"] Apr 16 15:16:09.731045 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:09.731015 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-582h49t"] Apr 16 15:16:09.733574 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:09.733551 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-582h49t" Apr 16 15:16:09.745574 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:09.745549 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-582h49t"] Apr 16 15:16:09.808245 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:09.808208 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/380feacd-c563-4cf9-b4ae-3b2ad929ee2e-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-797f447c889vskf\" (UID: \"380feacd-c563-4cf9-b4ae-3b2ad929ee2e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-797f447c889vskf" Apr 16 15:16:09.808501 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:09.808252 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6957c6a7-2da6-4f49-891e-550249e065fa-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-582h49t\" (UID: \"6957c6a7-2da6-4f49-891e-550249e065fa\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-582h49t" Apr 16 15:16:09.808501 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:09.808283 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zd6rz\" (UniqueName: \"kubernetes.io/projected/6957c6a7-2da6-4f49-891e-550249e065fa-kube-api-access-zd6rz\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-582h49t\" (UID: \"6957c6a7-2da6-4f49-891e-550249e065fa\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-582h49t" Apr 16 15:16:09.808501 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:09.808330 2579 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6957c6a7-2da6-4f49-891e-550249e065fa-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-582h49t\" (UID: \"6957c6a7-2da6-4f49-891e-550249e065fa\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-582h49t" Apr 16 15:16:09.808501 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:09.808365 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6957c6a7-2da6-4f49-891e-550249e065fa-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-582h49t\" (UID: \"6957c6a7-2da6-4f49-891e-550249e065fa\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-582h49t" Apr 16 15:16:09.808501 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:09.808390 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6957c6a7-2da6-4f49-891e-550249e065fa-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-582h49t\" (UID: \"6957c6a7-2da6-4f49-891e-550249e065fa\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-582h49t" Apr 16 15:16:09.808501 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:09.808414 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/380feacd-c563-4cf9-b4ae-3b2ad929ee2e-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-797f447c889vskf\" (UID: \"380feacd-c563-4cf9-b4ae-3b2ad929ee2e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-797f447c889vskf" Apr 16 15:16:09.808501 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:09.808472 2579 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/380feacd-c563-4cf9-b4ae-3b2ad929ee2e-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-797f447c889vskf\" (UID: \"380feacd-c563-4cf9-b4ae-3b2ad929ee2e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-797f447c889vskf" Apr 16 15:16:09.808798 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:09.808504 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/380feacd-c563-4cf9-b4ae-3b2ad929ee2e-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-797f447c889vskf\" (UID: \"380feacd-c563-4cf9-b4ae-3b2ad929ee2e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-797f447c889vskf" Apr 16 15:16:09.808798 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:09.808530 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6957c6a7-2da6-4f49-891e-550249e065fa-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-582h49t\" (UID: \"6957c6a7-2da6-4f49-891e-550249e065fa\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-582h49t" Apr 16 15:16:09.808798 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:09.808617 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/380feacd-c563-4cf9-b4ae-3b2ad929ee2e-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-797f447c889vskf\" (UID: \"380feacd-c563-4cf9-b4ae-3b2ad929ee2e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-797f447c889vskf" Apr 16 15:16:09.808798 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:09.808676 2579 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8n7r\" (UniqueName: \"kubernetes.io/projected/380feacd-c563-4cf9-b4ae-3b2ad929ee2e-kube-api-access-f8n7r\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-797f447c889vskf\" (UID: \"380feacd-c563-4cf9-b4ae-3b2ad929ee2e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-797f447c889vskf" Apr 16 15:16:09.909336 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:09.909305 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/380feacd-c563-4cf9-b4ae-3b2ad929ee2e-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-797f447c889vskf\" (UID: \"380feacd-c563-4cf9-b4ae-3b2ad929ee2e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-797f447c889vskf" Apr 16 15:16:09.909534 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:09.909348 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f8n7r\" (UniqueName: \"kubernetes.io/projected/380feacd-c563-4cf9-b4ae-3b2ad929ee2e-kube-api-access-f8n7r\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-797f447c889vskf\" (UID: \"380feacd-c563-4cf9-b4ae-3b2ad929ee2e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-797f447c889vskf" Apr 16 15:16:09.909534 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:09.909370 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/380feacd-c563-4cf9-b4ae-3b2ad929ee2e-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-797f447c889vskf\" (UID: \"380feacd-c563-4cf9-b4ae-3b2ad929ee2e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-797f447c889vskf" Apr 16 15:16:09.909534 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:09.909390 2579 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6957c6a7-2da6-4f49-891e-550249e065fa-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-582h49t\" (UID: \"6957c6a7-2da6-4f49-891e-550249e065fa\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-582h49t" Apr 16 15:16:09.909534 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:09.909416 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zd6rz\" (UniqueName: \"kubernetes.io/projected/6957c6a7-2da6-4f49-891e-550249e065fa-kube-api-access-zd6rz\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-582h49t\" (UID: \"6957c6a7-2da6-4f49-891e-550249e065fa\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-582h49t" Apr 16 15:16:09.909754 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:09.909548 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6957c6a7-2da6-4f49-891e-550249e065fa-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-582h49t\" (UID: \"6957c6a7-2da6-4f49-891e-550249e065fa\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-582h49t" Apr 16 15:16:09.909754 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:09.909600 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6957c6a7-2da6-4f49-891e-550249e065fa-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-582h49t\" (UID: \"6957c6a7-2da6-4f49-891e-550249e065fa\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-582h49t" Apr 16 15:16:09.909754 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:09.909635 2579 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6957c6a7-2da6-4f49-891e-550249e065fa-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-582h49t\" (UID: \"6957c6a7-2da6-4f49-891e-550249e065fa\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-582h49t" Apr 16 15:16:09.909754 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:09.909668 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/380feacd-c563-4cf9-b4ae-3b2ad929ee2e-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-797f447c889vskf\" (UID: \"380feacd-c563-4cf9-b4ae-3b2ad929ee2e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-797f447c889vskf" Apr 16 15:16:09.909754 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:09.909715 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/380feacd-c563-4cf9-b4ae-3b2ad929ee2e-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-797f447c889vskf\" (UID: \"380feacd-c563-4cf9-b4ae-3b2ad929ee2e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-797f447c889vskf" Apr 16 15:16:09.909754 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:09.909743 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/380feacd-c563-4cf9-b4ae-3b2ad929ee2e-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-797f447c889vskf\" (UID: \"380feacd-c563-4cf9-b4ae-3b2ad929ee2e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-797f447c889vskf" Apr 16 15:16:09.910090 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:09.909771 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: 
\"kubernetes.io/empty-dir/6957c6a7-2da6-4f49-891e-550249e065fa-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-582h49t\" (UID: \"6957c6a7-2da6-4f49-891e-550249e065fa\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-582h49t" Apr 16 15:16:09.910090 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:09.909812 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6957c6a7-2da6-4f49-891e-550249e065fa-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-582h49t\" (UID: \"6957c6a7-2da6-4f49-891e-550249e065fa\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-582h49t" Apr 16 15:16:09.910090 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:09.909816 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/380feacd-c563-4cf9-b4ae-3b2ad929ee2e-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-797f447c889vskf\" (UID: \"380feacd-c563-4cf9-b4ae-3b2ad929ee2e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-797f447c889vskf" Apr 16 15:16:09.910090 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:09.909894 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6957c6a7-2da6-4f49-891e-550249e065fa-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-582h49t\" (UID: \"6957c6a7-2da6-4f49-891e-550249e065fa\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-582h49t" Apr 16 15:16:09.910279 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:09.910120 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6957c6a7-2da6-4f49-891e-550249e065fa-home\") pod 
\"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-582h49t\" (UID: \"6957c6a7-2da6-4f49-891e-550249e065fa\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-582h49t" Apr 16 15:16:09.910279 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:09.910120 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/380feacd-c563-4cf9-b4ae-3b2ad929ee2e-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-797f447c889vskf\" (UID: \"380feacd-c563-4cf9-b4ae-3b2ad929ee2e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-797f447c889vskf" Apr 16 15:16:09.910279 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:09.910268 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/380feacd-c563-4cf9-b4ae-3b2ad929ee2e-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-797f447c889vskf\" (UID: \"380feacd-c563-4cf9-b4ae-3b2ad929ee2e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-797f447c889vskf" Apr 16 15:16:09.912165 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:09.912130 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/380feacd-c563-4cf9-b4ae-3b2ad929ee2e-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-797f447c889vskf\" (UID: \"380feacd-c563-4cf9-b4ae-3b2ad929ee2e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-797f447c889vskf" Apr 16 15:16:09.912296 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:09.912243 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6957c6a7-2da6-4f49-891e-550249e065fa-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-582h49t\" (UID: 
\"6957c6a7-2da6-4f49-891e-550249e065fa\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-582h49t" Apr 16 15:16:09.912418 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:09.912396 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/380feacd-c563-4cf9-b4ae-3b2ad929ee2e-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-797f447c889vskf\" (UID: \"380feacd-c563-4cf9-b4ae-3b2ad929ee2e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-797f447c889vskf" Apr 16 15:16:09.912482 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:09.912448 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6957c6a7-2da6-4f49-891e-550249e065fa-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-582h49t\" (UID: \"6957c6a7-2da6-4f49-891e-550249e065fa\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-582h49t" Apr 16 15:16:09.918889 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:09.918865 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8n7r\" (UniqueName: \"kubernetes.io/projected/380feacd-c563-4cf9-b4ae-3b2ad929ee2e-kube-api-access-f8n7r\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-797f447c889vskf\" (UID: \"380feacd-c563-4cf9-b4ae-3b2ad929ee2e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-797f447c889vskf" Apr 16 15:16:09.919002 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:09.918923 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zd6rz\" (UniqueName: \"kubernetes.io/projected/6957c6a7-2da6-4f49-891e-550249e065fa-kube-api-access-zd6rz\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-582h49t\" (UID: \"6957c6a7-2da6-4f49-891e-550249e065fa\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-582h49t" Apr 16 15:16:09.986666 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:09.986570 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-797f447c889vskf" Apr 16 15:16:10.045288 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:10.044972 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-582h49t" Apr 16 15:16:10.123959 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:10.123880 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-797f447c889vskf"] Apr 16 15:16:10.127203 ip-10-0-139-47 kubenswrapper[2579]: W0416 15:16:10.127166 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod380feacd_c563_4cf9_b4ae_3b2ad929ee2e.slice/crio-57113a0cf9d99250060dc67f2c80395cca0f591591ae3d36e53f8b364572a911 WatchSource:0}: Error finding container 57113a0cf9d99250060dc67f2c80395cca0f591591ae3d36e53f8b364572a911: Status 404 returned error can't find the container with id 57113a0cf9d99250060dc67f2c80395cca0f591591ae3d36e53f8b364572a911 Apr 16 15:16:10.186185 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:10.186160 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-582h49t"] Apr 16 15:16:10.188523 ip-10-0-139-47 kubenswrapper[2579]: W0416 15:16:10.188498 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6957c6a7_2da6_4f49_891e_550249e065fa.slice/crio-c4fbfe54bfc1e113cebdae0ed2ccd19a3d3d9188f6dadee4833d38414ff46318 WatchSource:0}: Error finding container 
c4fbfe54bfc1e113cebdae0ed2ccd19a3d3d9188f6dadee4833d38414ff46318: Status 404 returned error can't find the container with id c4fbfe54bfc1e113cebdae0ed2ccd19a3d3d9188f6dadee4833d38414ff46318 Apr 16 15:16:10.851970 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:10.851932 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-582h49t" event={"ID":"6957c6a7-2da6-4f49-891e-550249e065fa","Type":"ContainerStarted","Data":"465ce9bfa2a8b72af78f2d6b6d888302e94ba8aa5e110a1a074202064918020e"} Apr 16 15:16:10.851970 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:10.851974 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-582h49t" event={"ID":"6957c6a7-2da6-4f49-891e-550249e065fa","Type":"ContainerStarted","Data":"c4fbfe54bfc1e113cebdae0ed2ccd19a3d3d9188f6dadee4833d38414ff46318"} Apr 16 15:16:10.853139 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:10.853114 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-797f447c889vskf" event={"ID":"380feacd-c563-4cf9-b4ae-3b2ad929ee2e","Type":"ContainerStarted","Data":"57113a0cf9d99250060dc67f2c80395cca0f591591ae3d36e53f8b364572a911"} Apr 16 15:16:11.858880 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:11.858781 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-797f447c889vskf" event={"ID":"380feacd-c563-4cf9-b4ae-3b2ad929ee2e","Type":"ContainerStarted","Data":"d936422a42c03455cb7a897b4a1d608d1256633a1c62dc859cfe51807231dd8c"} Apr 16 15:16:11.859370 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:11.859103 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-797f447c889vskf" Apr 16 15:16:12.117550 ip-10-0-139-47 
kubenswrapper[2579]: I0416 15:16:12.117448 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-685684b6bf-zxm6k"] Apr 16 15:16:12.117875 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:12.117832 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-685684b6bf-zxm6k" podUID="5fab0ec6-79f0-4f5b-93e4-761651092e95" containerName="main" containerID="cri-o://b9d2e9888e1538b47583588620eb059c00ca4c3ac61ba53d49c3b3b6cd26830c" gracePeriod=30 Apr 16 15:16:12.864573 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:12.864539 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-797f447c889vskf" event={"ID":"380feacd-c563-4cf9-b4ae-3b2ad929ee2e","Type":"ContainerStarted","Data":"adfff3fa81fb927b05e5689ba3583551fb767dd59de99eb36905928a3d347609"} Apr 16 15:16:14.873576 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:14.873491 2579 generic.go:358] "Generic (PLEG): container finished" podID="6957c6a7-2da6-4f49-891e-550249e065fa" containerID="465ce9bfa2a8b72af78f2d6b6d888302e94ba8aa5e110a1a074202064918020e" exitCode=0 Apr 16 15:16:14.873576 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:14.873552 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-582h49t" event={"ID":"6957c6a7-2da6-4f49-891e-550249e065fa","Type":"ContainerDied","Data":"465ce9bfa2a8b72af78f2d6b6d888302e94ba8aa5e110a1a074202064918020e"} Apr 16 15:16:15.879472 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:15.879439 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-582h49t" event={"ID":"6957c6a7-2da6-4f49-891e-550249e065fa","Type":"ContainerStarted","Data":"debabf1409fc048b6e5641fc240fc51af0dc31276d67cfc993e9e82779a7b6da"} Apr 16 15:16:15.900142 
ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:15.900087 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-582h49t" podStartSLOduration=6.900066526 podStartE2EDuration="6.900066526s" podCreationTimestamp="2026-04-16 15:16:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:16:15.898249288 +0000 UTC m=+1425.080787062" watchObservedRunningTime="2026-04-16 15:16:15.900066526 +0000 UTC m=+1425.082604301" Apr 16 15:16:16.884788 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:16.884742 2579 generic.go:358] "Generic (PLEG): container finished" podID="380feacd-c563-4cf9-b4ae-3b2ad929ee2e" containerID="adfff3fa81fb927b05e5689ba3583551fb767dd59de99eb36905928a3d347609" exitCode=0 Apr 16 15:16:16.885273 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:16.884813 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-797f447c889vskf" event={"ID":"380feacd-c563-4cf9-b4ae-3b2ad929ee2e","Type":"ContainerDied","Data":"adfff3fa81fb927b05e5689ba3583551fb767dd59de99eb36905928a3d347609"} Apr 16 15:16:17.889543 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:17.889506 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-797f447c889vskf" event={"ID":"380feacd-c563-4cf9-b4ae-3b2ad929ee2e","Type":"ContainerStarted","Data":"1ec0ad0a3cfdc83bc77cd23e6f3f276a7b44fae7497b063a918f8639672b13b9"} Apr 16 15:16:17.911853 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:17.911793 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-797f447c889vskf" podStartSLOduration=7.921656009 podStartE2EDuration="8.911774645s" podCreationTimestamp="2026-04-16 15:16:09 +0000 UTC" 
firstStartedPulling="2026-04-16 15:16:10.129186502 +0000 UTC m=+1419.311724255" lastFinishedPulling="2026-04-16 15:16:11.119305136 +0000 UTC m=+1420.301842891" observedRunningTime="2026-04-16 15:16:17.911023659 +0000 UTC m=+1427.093561474" watchObservedRunningTime="2026-04-16 15:16:17.911774645 +0000 UTC m=+1427.094312420" Apr 16 15:16:19.986947 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:19.986846 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-797f447c889vskf" Apr 16 15:16:19.986947 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:19.986894 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-797f447c889vskf" Apr 16 15:16:19.988412 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:19.988388 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-797f447c889vskf" podUID="380feacd-c563-4cf9-b4ae-3b2ad929ee2e" containerName="main" probeResult="failure" output="Get \"https://10.132.0.33:8001/health\": dial tcp 10.132.0.33:8001: connect: connection refused" Apr 16 15:16:20.046161 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:20.046121 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-582h49t" Apr 16 15:16:20.046161 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:20.046173 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-582h49t" Apr 16 15:16:20.047787 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:20.047755 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-582h49t" 
podUID="6957c6a7-2da6-4f49-891e-550249e065fa" containerName="main" probeResult="failure" output="Get \"https://10.132.0.34:8000/health\": dial tcp 10.132.0.34:8000: connect: connection refused" Apr 16 15:16:29.987603 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:29.987541 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-797f447c889vskf" podUID="380feacd-c563-4cf9-b4ae-3b2ad929ee2e" containerName="main" probeResult="failure" output="Get \"https://10.132.0.33:8001/health\": dial tcp 10.132.0.33:8001: connect: connection refused" Apr 16 15:16:30.001198 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:30.001164 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-797f447c889vskf" Apr 16 15:16:30.046030 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:30.045983 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-582h49t" podUID="6957c6a7-2da6-4f49-891e-550249e065fa" containerName="main" probeResult="failure" output="Get \"https://10.132.0.34:8000/health\": dial tcp 10.132.0.34:8000: connect: connection refused" Apr 16 15:16:37.661069 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:37.661035 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-76b5d59b9fc5g2w"] Apr 16 15:16:37.685985 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:37.685958 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-76b5d59b9fc5g2w"] Apr 16 15:16:37.686158 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:37.686091 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-76b5d59b9fc5g2w" Apr 16 15:16:37.689464 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:37.689436 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv8de1d74aab16d9cabd8b5aafeb5248e8-kserve-self-signed-certs\"" Apr 16 15:16:37.756034 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:37.755999 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e7641919-79c9-46bf-a715-0969c96cecd9-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-76b5d59b9fc5g2w\" (UID: \"e7641919-79c9-46bf-a715-0969c96cecd9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-76b5d59b9fc5g2w" Apr 16 15:16:37.756034 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:37.756037 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e7641919-79c9-46bf-a715-0969c96cecd9-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-76b5d59b9fc5g2w\" (UID: \"e7641919-79c9-46bf-a715-0969c96cecd9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-76b5d59b9fc5g2w" Apr 16 15:16:37.756249 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:37.756110 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e7641919-79c9-46bf-a715-0969c96cecd9-dshm\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-76b5d59b9fc5g2w\" (UID: \"e7641919-79c9-46bf-a715-0969c96cecd9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-76b5d59b9fc5g2w" Apr 16 15:16:37.756249 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:37.756135 2579 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e7641919-79c9-46bf-a715-0969c96cecd9-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-76b5d59b9fc5g2w\" (UID: \"e7641919-79c9-46bf-a715-0969c96cecd9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-76b5d59b9fc5g2w" Apr 16 15:16:37.756249 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:37.756185 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e7641919-79c9-46bf-a715-0969c96cecd9-home\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-76b5d59b9fc5g2w\" (UID: \"e7641919-79c9-46bf-a715-0969c96cecd9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-76b5d59b9fc5g2w" Apr 16 15:16:37.756249 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:37.756212 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xghpr\" (UniqueName: \"kubernetes.io/projected/e7641919-79c9-46bf-a715-0969c96cecd9-kube-api-access-xghpr\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-76b5d59b9fc5g2w\" (UID: \"e7641919-79c9-46bf-a715-0969c96cecd9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-76b5d59b9fc5g2w" Apr 16 15:16:37.857431 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:37.857384 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e7641919-79c9-46bf-a715-0969c96cecd9-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-76b5d59b9fc5g2w\" (UID: \"e7641919-79c9-46bf-a715-0969c96cecd9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-76b5d59b9fc5g2w" Apr 16 15:16:37.857639 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:37.857444 2579 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e7641919-79c9-46bf-a715-0969c96cecd9-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-76b5d59b9fc5g2w\" (UID: \"e7641919-79c9-46bf-a715-0969c96cecd9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-76b5d59b9fc5g2w" Apr 16 15:16:37.857639 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:37.857514 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e7641919-79c9-46bf-a715-0969c96cecd9-dshm\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-76b5d59b9fc5g2w\" (UID: \"e7641919-79c9-46bf-a715-0969c96cecd9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-76b5d59b9fc5g2w" Apr 16 15:16:37.857639 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:37.857549 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e7641919-79c9-46bf-a715-0969c96cecd9-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-76b5d59b9fc5g2w\" (UID: \"e7641919-79c9-46bf-a715-0969c96cecd9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-76b5d59b9fc5g2w" Apr 16 15:16:37.857820 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:37.857707 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e7641919-79c9-46bf-a715-0969c96cecd9-home\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-76b5d59b9fc5g2w\" (UID: \"e7641919-79c9-46bf-a715-0969c96cecd9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-76b5d59b9fc5g2w" Apr 16 15:16:37.857820 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:37.857773 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-xghpr\" (UniqueName: \"kubernetes.io/projected/e7641919-79c9-46bf-a715-0969c96cecd9-kube-api-access-xghpr\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-76b5d59b9fc5g2w\" (UID: \"e7641919-79c9-46bf-a715-0969c96cecd9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-76b5d59b9fc5g2w" Apr 16 15:16:37.857986 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:37.857883 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e7641919-79c9-46bf-a715-0969c96cecd9-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-76b5d59b9fc5g2w\" (UID: \"e7641919-79c9-46bf-a715-0969c96cecd9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-76b5d59b9fc5g2w" Apr 16 15:16:37.857986 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:37.857892 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e7641919-79c9-46bf-a715-0969c96cecd9-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-76b5d59b9fc5g2w\" (UID: \"e7641919-79c9-46bf-a715-0969c96cecd9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-76b5d59b9fc5g2w" Apr 16 15:16:37.858235 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:37.858203 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e7641919-79c9-46bf-a715-0969c96cecd9-home\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-76b5d59b9fc5g2w\" (UID: \"e7641919-79c9-46bf-a715-0969c96cecd9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-76b5d59b9fc5g2w" Apr 16 15:16:37.859983 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:37.859956 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: 
\"kubernetes.io/empty-dir/e7641919-79c9-46bf-a715-0969c96cecd9-dshm\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-76b5d59b9fc5g2w\" (UID: \"e7641919-79c9-46bf-a715-0969c96cecd9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-76b5d59b9fc5g2w" Apr 16 15:16:37.860121 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:37.860102 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e7641919-79c9-46bf-a715-0969c96cecd9-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-76b5d59b9fc5g2w\" (UID: \"e7641919-79c9-46bf-a715-0969c96cecd9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-76b5d59b9fc5g2w" Apr 16 15:16:37.866217 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:37.866189 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xghpr\" (UniqueName: \"kubernetes.io/projected/e7641919-79c9-46bf-a715-0969c96cecd9-kube-api-access-xghpr\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-76b5d59b9fc5g2w\" (UID: \"e7641919-79c9-46bf-a715-0969c96cecd9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-76b5d59b9fc5g2w" Apr 16 15:16:37.996755 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:37.996672 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-76b5d59b9fc5g2w" Apr 16 15:16:38.145396 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:38.145360 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-76b5d59b9fc5g2w"] Apr 16 15:16:38.149094 ip-10-0-139-47 kubenswrapper[2579]: W0416 15:16:38.149062 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode7641919_79c9_46bf_a715_0969c96cecd9.slice/crio-b94b3515752c2c08001d1d91746909385829faa2e44b490fe3e6771e554d0046 WatchSource:0}: Error finding container b94b3515752c2c08001d1d91746909385829faa2e44b490fe3e6771e554d0046: Status 404 returned error can't find the container with id b94b3515752c2c08001d1d91746909385829faa2e44b490fe3e6771e554d0046 Apr 16 15:16:38.969474 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:38.969437 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-76b5d59b9fc5g2w" event={"ID":"e7641919-79c9-46bf-a715-0969c96cecd9","Type":"ContainerStarted","Data":"22333c18e845e23abe959399f9ca8605e6126d8ac6162d0525e9bcd2a37e3b6e"} Apr 16 15:16:38.969474 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:38.969477 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-76b5d59b9fc5g2w" event={"ID":"e7641919-79c9-46bf-a715-0969c96cecd9","Type":"ContainerStarted","Data":"b94b3515752c2c08001d1d91746909385829faa2e44b490fe3e6771e554d0046"} Apr 16 15:16:39.987988 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:39.987943 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-797f447c889vskf" podUID="380feacd-c563-4cf9-b4ae-3b2ad929ee2e" containerName="main" probeResult="failure" output="Get 
\"https://10.132.0.33:8001/health\": dial tcp 10.132.0.33:8001: connect: connection refused" Apr 16 15:16:40.046600 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:40.046552 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-582h49t" podUID="6957c6a7-2da6-4f49-891e-550249e065fa" containerName="main" probeResult="failure" output="Get \"https://10.132.0.34:8000/health\": dial tcp 10.132.0.34:8000: connect: connection refused" Apr 16 15:16:42.430176 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:42.430146 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-test-kserve-685684b6bf-zxm6k_5fab0ec6-79f0-4f5b-93e4-761651092e95/main/0.log" Apr 16 15:16:42.430675 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:42.430655 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-685684b6bf-zxm6k" Apr 16 15:16:42.502113 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:42.502078 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5fab0ec6-79f0-4f5b-93e4-761651092e95-kserve-provision-location\") pod \"5fab0ec6-79f0-4f5b-93e4-761651092e95\" (UID: \"5fab0ec6-79f0-4f5b-93e4-761651092e95\") " Apr 16 15:16:42.502294 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:42.502132 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5fab0ec6-79f0-4f5b-93e4-761651092e95-dshm\") pod \"5fab0ec6-79f0-4f5b-93e4-761651092e95\" (UID: \"5fab0ec6-79f0-4f5b-93e4-761651092e95\") " Apr 16 15:16:42.502294 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:42.502216 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/5fab0ec6-79f0-4f5b-93e4-761651092e95-model-cache\") pod \"5fab0ec6-79f0-4f5b-93e4-761651092e95\" (UID: \"5fab0ec6-79f0-4f5b-93e4-761651092e95\") " Apr 16 15:16:42.502294 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:42.502255 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sfvn7\" (UniqueName: \"kubernetes.io/projected/5fab0ec6-79f0-4f5b-93e4-761651092e95-kube-api-access-sfvn7\") pod \"5fab0ec6-79f0-4f5b-93e4-761651092e95\" (UID: \"5fab0ec6-79f0-4f5b-93e4-761651092e95\") " Apr 16 15:16:42.502294 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:42.502291 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5fab0ec6-79f0-4f5b-93e4-761651092e95-tls-certs\") pod \"5fab0ec6-79f0-4f5b-93e4-761651092e95\" (UID: \"5fab0ec6-79f0-4f5b-93e4-761651092e95\") " Apr 16 15:16:42.502515 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:42.502330 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5fab0ec6-79f0-4f5b-93e4-761651092e95-home\") pod \"5fab0ec6-79f0-4f5b-93e4-761651092e95\" (UID: \"5fab0ec6-79f0-4f5b-93e4-761651092e95\") " Apr 16 15:16:42.502808 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:42.502773 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5fab0ec6-79f0-4f5b-93e4-761651092e95-model-cache" (OuterVolumeSpecName: "model-cache") pod "5fab0ec6-79f0-4f5b-93e4-761651092e95" (UID: "5fab0ec6-79f0-4f5b-93e4-761651092e95"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:16:42.502961 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:42.502923 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5fab0ec6-79f0-4f5b-93e4-761651092e95-home" (OuterVolumeSpecName: "home") pod "5fab0ec6-79f0-4f5b-93e4-761651092e95" (UID: "5fab0ec6-79f0-4f5b-93e4-761651092e95"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:16:42.505248 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:42.505215 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fab0ec6-79f0-4f5b-93e4-761651092e95-kube-api-access-sfvn7" (OuterVolumeSpecName: "kube-api-access-sfvn7") pod "5fab0ec6-79f0-4f5b-93e4-761651092e95" (UID: "5fab0ec6-79f0-4f5b-93e4-761651092e95"). InnerVolumeSpecName "kube-api-access-sfvn7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 15:16:42.505602 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:42.505571 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5fab0ec6-79f0-4f5b-93e4-761651092e95-dshm" (OuterVolumeSpecName: "dshm") pod "5fab0ec6-79f0-4f5b-93e4-761651092e95" (UID: "5fab0ec6-79f0-4f5b-93e4-761651092e95"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:16:42.509184 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:42.509156 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fab0ec6-79f0-4f5b-93e4-761651092e95-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "5fab0ec6-79f0-4f5b-93e4-761651092e95" (UID: "5fab0ec6-79f0-4f5b-93e4-761651092e95"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 15:16:42.567650 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:42.567613 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5fab0ec6-79f0-4f5b-93e4-761651092e95-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "5fab0ec6-79f0-4f5b-93e4-761651092e95" (UID: "5fab0ec6-79f0-4f5b-93e4-761651092e95"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:16:42.603745 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:42.603706 2579 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5fab0ec6-79f0-4f5b-93e4-761651092e95-kserve-provision-location\") on node \"ip-10-0-139-47.ec2.internal\" DevicePath \"\"" Apr 16 15:16:42.603745 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:42.603733 2579 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5fab0ec6-79f0-4f5b-93e4-761651092e95-dshm\") on node \"ip-10-0-139-47.ec2.internal\" DevicePath \"\"" Apr 16 15:16:42.603745 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:42.603742 2579 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5fab0ec6-79f0-4f5b-93e4-761651092e95-model-cache\") on node \"ip-10-0-139-47.ec2.internal\" DevicePath \"\"" Apr 16 15:16:42.603745 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:42.603752 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sfvn7\" (UniqueName: \"kubernetes.io/projected/5fab0ec6-79f0-4f5b-93e4-761651092e95-kube-api-access-sfvn7\") on node \"ip-10-0-139-47.ec2.internal\" DevicePath \"\"" Apr 16 15:16:42.604105 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:42.603763 2579 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/5fab0ec6-79f0-4f5b-93e4-761651092e95-tls-certs\") on node \"ip-10-0-139-47.ec2.internal\" DevicePath \"\"" Apr 16 15:16:42.604105 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:42.603772 2579 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5fab0ec6-79f0-4f5b-93e4-761651092e95-home\") on node \"ip-10-0-139-47.ec2.internal\" DevicePath \"\"" Apr 16 15:16:42.986084 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:42.986057 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-test-kserve-685684b6bf-zxm6k_5fab0ec6-79f0-4f5b-93e4-761651092e95/main/0.log" Apr 16 15:16:42.986474 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:42.986448 2579 generic.go:358] "Generic (PLEG): container finished" podID="5fab0ec6-79f0-4f5b-93e4-761651092e95" containerID="b9d2e9888e1538b47583588620eb059c00ca4c3ac61ba53d49c3b3b6cd26830c" exitCode=137 Apr 16 15:16:42.986586 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:42.986519 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-685684b6bf-zxm6k" event={"ID":"5fab0ec6-79f0-4f5b-93e4-761651092e95","Type":"ContainerDied","Data":"b9d2e9888e1538b47583588620eb059c00ca4c3ac61ba53d49c3b3b6cd26830c"} Apr 16 15:16:42.986586 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:42.986548 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-685684b6bf-zxm6k" Apr 16 15:16:42.986586 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:42.986565 2579 scope.go:117] "RemoveContainer" containerID="b9d2e9888e1538b47583588620eb059c00ca4c3ac61ba53d49c3b3b6cd26830c" Apr 16 15:16:42.986740 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:42.986551 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-685684b6bf-zxm6k" event={"ID":"5fab0ec6-79f0-4f5b-93e4-761651092e95","Type":"ContainerDied","Data":"aba58c966064fae0780fd225c43d15a0799a1f87e71d586968c2b0aef51b7385"} Apr 16 15:16:42.988272 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:42.988235 2579 generic.go:358] "Generic (PLEG): container finished" podID="e7641919-79c9-46bf-a715-0969c96cecd9" containerID="22333c18e845e23abe959399f9ca8605e6126d8ac6162d0525e9bcd2a37e3b6e" exitCode=0 Apr 16 15:16:42.988384 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:42.988303 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-76b5d59b9fc5g2w" event={"ID":"e7641919-79c9-46bf-a715-0969c96cecd9","Type":"ContainerDied","Data":"22333c18e845e23abe959399f9ca8605e6126d8ac6162d0525e9bcd2a37e3b6e"} Apr 16 15:16:43.007800 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:43.007670 2579 scope.go:117] "RemoveContainer" containerID="d6170545465409ce014879498e6f304f8dfedf87abd0ef83dfe38bf334a23236" Apr 16 15:16:43.022986 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:43.022959 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-685684b6bf-zxm6k"] Apr 16 15:16:43.026144 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:43.026122 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-685684b6bf-zxm6k"] Apr 16 15:16:43.074390 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:43.074359 2579 
scope.go:117] "RemoveContainer" containerID="b9d2e9888e1538b47583588620eb059c00ca4c3ac61ba53d49c3b3b6cd26830c" Apr 16 15:16:43.074762 ip-10-0-139-47 kubenswrapper[2579]: E0416 15:16:43.074734 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9d2e9888e1538b47583588620eb059c00ca4c3ac61ba53d49c3b3b6cd26830c\": container with ID starting with b9d2e9888e1538b47583588620eb059c00ca4c3ac61ba53d49c3b3b6cd26830c not found: ID does not exist" containerID="b9d2e9888e1538b47583588620eb059c00ca4c3ac61ba53d49c3b3b6cd26830c" Apr 16 15:16:43.074859 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:43.074777 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9d2e9888e1538b47583588620eb059c00ca4c3ac61ba53d49c3b3b6cd26830c"} err="failed to get container status \"b9d2e9888e1538b47583588620eb059c00ca4c3ac61ba53d49c3b3b6cd26830c\": rpc error: code = NotFound desc = could not find container \"b9d2e9888e1538b47583588620eb059c00ca4c3ac61ba53d49c3b3b6cd26830c\": container with ID starting with b9d2e9888e1538b47583588620eb059c00ca4c3ac61ba53d49c3b3b6cd26830c not found: ID does not exist" Apr 16 15:16:43.074859 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:43.074808 2579 scope.go:117] "RemoveContainer" containerID="d6170545465409ce014879498e6f304f8dfedf87abd0ef83dfe38bf334a23236" Apr 16 15:16:43.075157 ip-10-0-139-47 kubenswrapper[2579]: E0416 15:16:43.075126 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6170545465409ce014879498e6f304f8dfedf87abd0ef83dfe38bf334a23236\": container with ID starting with d6170545465409ce014879498e6f304f8dfedf87abd0ef83dfe38bf334a23236 not found: ID does not exist" containerID="d6170545465409ce014879498e6f304f8dfedf87abd0ef83dfe38bf334a23236" Apr 16 15:16:43.075209 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:43.075159 2579 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6170545465409ce014879498e6f304f8dfedf87abd0ef83dfe38bf334a23236"} err="failed to get container status \"d6170545465409ce014879498e6f304f8dfedf87abd0ef83dfe38bf334a23236\": rpc error: code = NotFound desc = could not find container \"d6170545465409ce014879498e6f304f8dfedf87abd0ef83dfe38bf334a23236\": container with ID starting with d6170545465409ce014879498e6f304f8dfedf87abd0ef83dfe38bf334a23236 not found: ID does not exist" Apr 16 15:16:43.356548 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:43.356504 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fab0ec6-79f0-4f5b-93e4-761651092e95" path="/var/lib/kubelet/pods/5fab0ec6-79f0-4f5b-93e4-761651092e95/volumes" Apr 16 15:16:43.996324 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:43.996278 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-76b5d59b9fc5g2w" event={"ID":"e7641919-79c9-46bf-a715-0969c96cecd9","Type":"ContainerStarted","Data":"f379c8f298735fcf31f5522cc3bd6c6f17b3817297096724e4582ef81ce00af5"} Apr 16 15:16:44.018058 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:44.017993 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-76b5d59b9fc5g2w" podStartSLOduration=7.017978625 podStartE2EDuration="7.017978625s" podCreationTimestamp="2026-04-16 15:16:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:16:44.015379691 +0000 UTC m=+1453.197917465" watchObservedRunningTime="2026-04-16 15:16:44.017978625 +0000 UTC m=+1453.200516399" Apr 16 15:16:47.996997 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:47.996944 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-76b5d59b9fc5g2w" Apr 16 15:16:47.996997 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:47.997005 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-76b5d59b9fc5g2w" Apr 16 15:16:47.999077 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:47.999044 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-76b5d59b9fc5g2w" podUID="e7641919-79c9-46bf-a715-0969c96cecd9" containerName="main" probeResult="failure" output="Get \"https://10.132.0.35:8000/health\": dial tcp 10.132.0.35:8000: connect: connection refused" Apr 16 15:16:49.987324 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:49.987270 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-797f447c889vskf" podUID="380feacd-c563-4cf9-b4ae-3b2ad929ee2e" containerName="main" probeResult="failure" output="Get \"https://10.132.0.33:8001/health\": dial tcp 10.132.0.33:8001: connect: connection refused" Apr 16 15:16:50.046602 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:50.046554 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-582h49t" podUID="6957c6a7-2da6-4f49-891e-550249e065fa" containerName="main" probeResult="failure" output="Get \"https://10.132.0.34:8000/health\": dial tcp 10.132.0.34:8000: connect: connection refused" Apr 16 15:16:57.998184 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:57.998130 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-76b5d59b9fc5g2w" podUID="e7641919-79c9-46bf-a715-0969c96cecd9" containerName="main" probeResult="failure" output="Get \"https://10.132.0.35:8000/health\": dial tcp 10.132.0.35:8000: connect: 
connection refused" Apr 16 15:16:59.987960 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:16:59.987888 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-797f447c889vskf" podUID="380feacd-c563-4cf9-b4ae-3b2ad929ee2e" containerName="main" probeResult="failure" output="Get \"https://10.132.0.33:8001/health\": dial tcp 10.132.0.33:8001: connect: connection refused" Apr 16 15:17:00.046142 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:17:00.046093 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-582h49t" podUID="6957c6a7-2da6-4f49-891e-550249e065fa" containerName="main" probeResult="failure" output="Get \"https://10.132.0.34:8000/health\": dial tcp 10.132.0.34:8000: connect: connection refused" Apr 16 15:17:07.997813 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:17:07.997765 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-76b5d59b9fc5g2w" podUID="e7641919-79c9-46bf-a715-0969c96cecd9" containerName="main" probeResult="failure" output="Get \"https://10.132.0.35:8000/health\": dial tcp 10.132.0.35:8000: connect: connection refused" Apr 16 15:17:09.987237 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:17:09.987186 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-797f447c889vskf" podUID="380feacd-c563-4cf9-b4ae-3b2ad929ee2e" containerName="main" probeResult="failure" output="Get \"https://10.132.0.33:8001/health\": dial tcp 10.132.0.33:8001: connect: connection refused" Apr 16 15:17:10.046291 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:17:10.046247 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-582h49t" podUID="6957c6a7-2da6-4f49-891e-550249e065fa" containerName="main" 
probeResult="failure" output="Get \"https://10.132.0.34:8000/health\": dial tcp 10.132.0.34:8000: connect: connection refused" Apr 16 15:17:17.997541 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:17:17.997492 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-76b5d59b9fc5g2w" podUID="e7641919-79c9-46bf-a715-0969c96cecd9" containerName="main" probeResult="failure" output="Get \"https://10.132.0.35:8000/health\": dial tcp 10.132.0.35:8000: connect: connection refused" Apr 16 15:17:19.987761 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:17:19.987712 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-797f447c889vskf" podUID="380feacd-c563-4cf9-b4ae-3b2ad929ee2e" containerName="main" probeResult="failure" output="Get \"https://10.132.0.33:8001/health\": dial tcp 10.132.0.33:8001: connect: connection refused" Apr 16 15:17:20.045794 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:17:20.045752 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-582h49t" podUID="6957c6a7-2da6-4f49-891e-550249e065fa" containerName="main" probeResult="failure" output="Get \"https://10.132.0.34:8000/health\": dial tcp 10.132.0.34:8000: connect: connection refused" Apr 16 15:17:27.997598 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:17:27.997556 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-76b5d59b9fc5g2w" podUID="e7641919-79c9-46bf-a715-0969c96cecd9" containerName="main" probeResult="failure" output="Get \"https://10.132.0.35:8000/health\": dial tcp 10.132.0.35:8000: connect: connection refused" Apr 16 15:17:29.987817 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:17:29.987759 2579 prober.go:120] "Probe failed" probeType="Startup" 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-797f447c889vskf" podUID="380feacd-c563-4cf9-b4ae-3b2ad929ee2e" containerName="main" probeResult="failure" output="Get \"https://10.132.0.33:8001/health\": dial tcp 10.132.0.33:8001: connect: connection refused" Apr 16 15:17:30.046490 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:17:30.046440 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-582h49t" podUID="6957c6a7-2da6-4f49-891e-550249e065fa" containerName="main" probeResult="failure" output="Get \"https://10.132.0.34:8000/health\": dial tcp 10.132.0.34:8000: connect: connection refused" Apr 16 15:17:37.997873 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:17:37.997814 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-76b5d59b9fc5g2w" podUID="e7641919-79c9-46bf-a715-0969c96cecd9" containerName="main" probeResult="failure" output="Get \"https://10.132.0.35:8000/health\": dial tcp 10.132.0.35:8000: connect: connection refused" Apr 16 15:17:39.987800 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:17:39.987744 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-797f447c889vskf" podUID="380feacd-c563-4cf9-b4ae-3b2ad929ee2e" containerName="main" probeResult="failure" output="Get \"https://10.132.0.33:8001/health\": dial tcp 10.132.0.33:8001: connect: connection refused" Apr 16 15:17:40.045914 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:17:40.045864 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-582h49t" podUID="6957c6a7-2da6-4f49-891e-550249e065fa" containerName="main" probeResult="failure" output="Get \"https://10.132.0.34:8000/health\": dial tcp 10.132.0.34:8000: connect: connection refused" Apr 16 15:17:47.997509 ip-10-0-139-47 
kubenswrapper[2579]: I0416 15:17:47.997463 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-76b5d59b9fc5g2w" podUID="e7641919-79c9-46bf-a715-0969c96cecd9" containerName="main" probeResult="failure" output="Get \"https://10.132.0.35:8000/health\": dial tcp 10.132.0.35:8000: connect: connection refused" Apr 16 15:17:49.987099 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:17:49.986994 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-797f447c889vskf" podUID="380feacd-c563-4cf9-b4ae-3b2ad929ee2e" containerName="main" probeResult="failure" output="Get \"https://10.132.0.33:8001/health\": dial tcp 10.132.0.33:8001: connect: connection refused" Apr 16 15:17:50.045871 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:17:50.045825 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-582h49t" podUID="6957c6a7-2da6-4f49-891e-550249e065fa" containerName="main" probeResult="failure" output="Get \"https://10.132.0.34:8000/health\": dial tcp 10.132.0.34:8000: connect: connection refused" Apr 16 15:17:57.997515 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:17:57.997463 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-76b5d59b9fc5g2w" podUID="e7641919-79c9-46bf-a715-0969c96cecd9" containerName="main" probeResult="failure" output="Get \"https://10.132.0.35:8000/health\": dial tcp 10.132.0.35:8000: connect: connection refused" Apr 16 15:17:59.987556 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:17:59.987506 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-797f447c889vskf" podUID="380feacd-c563-4cf9-b4ae-3b2ad929ee2e" containerName="main" probeResult="failure" output="Get 
\"https://10.132.0.33:8001/health\": dial tcp 10.132.0.33:8001: connect: connection refused" Apr 16 15:18:00.046052 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:18:00.046004 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-582h49t" podUID="6957c6a7-2da6-4f49-891e-550249e065fa" containerName="main" probeResult="failure" output="Get \"https://10.132.0.34:8000/health\": dial tcp 10.132.0.34:8000: connect: connection refused" Apr 16 15:18:07.998164 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:18:07.998107 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-76b5d59b9fc5g2w" podUID="e7641919-79c9-46bf-a715-0969c96cecd9" containerName="main" probeResult="failure" output="Get \"https://10.132.0.35:8000/health\": dial tcp 10.132.0.35:8000: connect: connection refused" Apr 16 15:18:09.987384 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:18:09.987329 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-797f447c889vskf" podUID="380feacd-c563-4cf9-b4ae-3b2ad929ee2e" containerName="main" probeResult="failure" output="Get \"https://10.132.0.33:8001/health\": dial tcp 10.132.0.33:8001: connect: connection refused" Apr 16 15:18:10.046443 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:18:10.046402 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-582h49t" podUID="6957c6a7-2da6-4f49-891e-550249e065fa" containerName="main" probeResult="failure" output="Get \"https://10.132.0.34:8000/health\": dial tcp 10.132.0.34:8000: connect: connection refused" Apr 16 15:18:17.997964 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:18:17.997922 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-76b5d59b9fc5g2w" 
podUID="e7641919-79c9-46bf-a715-0969c96cecd9" containerName="main" probeResult="failure" output="Get \"https://10.132.0.35:8000/health\": dial tcp 10.132.0.35:8000: connect: connection refused" Apr 16 15:18:19.987275 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:18:19.987215 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-797f447c889vskf" podUID="380feacd-c563-4cf9-b4ae-3b2ad929ee2e" containerName="main" probeResult="failure" output="Get \"https://10.132.0.33:8001/health\": dial tcp 10.132.0.33:8001: connect: connection refused" Apr 16 15:18:20.046317 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:18:20.046267 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-582h49t" podUID="6957c6a7-2da6-4f49-891e-550249e065fa" containerName="main" probeResult="failure" output="Get \"https://10.132.0.34:8000/health\": dial tcp 10.132.0.34:8000: connect: connection refused" Apr 16 15:18:27.997762 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:18:27.997710 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-76b5d59b9fc5g2w" podUID="e7641919-79c9-46bf-a715-0969c96cecd9" containerName="main" probeResult="failure" output="Get \"https://10.132.0.35:8000/health\": dial tcp 10.132.0.35:8000: connect: connection refused" Apr 16 15:18:29.987571 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:18:29.987522 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-797f447c889vskf" podUID="380feacd-c563-4cf9-b4ae-3b2ad929ee2e" containerName="main" probeResult="failure" output="Get \"https://10.132.0.33:8001/health\": dial tcp 10.132.0.33:8001: connect: connection refused" Apr 16 15:18:30.045800 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:18:30.045754 2579 prober.go:120] "Probe failed" 
probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-582h49t" podUID="6957c6a7-2da6-4f49-891e-550249e065fa" containerName="main" probeResult="failure" output="Get \"https://10.132.0.34:8000/health\": dial tcp 10.132.0.34:8000: connect: connection refused" Apr 16 15:18:37.998009 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:18:37.997959 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-76b5d59b9fc5g2w" podUID="e7641919-79c9-46bf-a715-0969c96cecd9" containerName="main" probeResult="failure" output="Get \"https://10.132.0.35:8000/health\": dial tcp 10.132.0.35:8000: connect: connection refused" Apr 16 15:18:39.987223 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:18:39.987171 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-797f447c889vskf" podUID="380feacd-c563-4cf9-b4ae-3b2ad929ee2e" containerName="main" probeResult="failure" output="Get \"https://10.132.0.33:8001/health\": dial tcp 10.132.0.33:8001: connect: connection refused" Apr 16 15:18:40.046248 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:18:40.046205 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-582h49t" podUID="6957c6a7-2da6-4f49-891e-550249e065fa" containerName="main" probeResult="failure" output="Get \"https://10.132.0.34:8000/health\": dial tcp 10.132.0.34:8000: connect: connection refused" Apr 16 15:18:47.998059 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:18:47.998005 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-76b5d59b9fc5g2w" podUID="e7641919-79c9-46bf-a715-0969c96cecd9" containerName="main" probeResult="failure" output="Get \"https://10.132.0.35:8000/health\": dial tcp 10.132.0.35:8000: connect: connection refused" Apr 16 
15:18:49.987028 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:18:49.986988 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-797f447c889vskf" podUID="380feacd-c563-4cf9-b4ae-3b2ad929ee2e" containerName="main" probeResult="failure" output="Get \"https://10.132.0.33:8001/health\": dial tcp 10.132.0.33:8001: connect: connection refused" Apr 16 15:18:50.046173 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:18:50.046133 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-582h49t" podUID="6957c6a7-2da6-4f49-891e-550249e065fa" containerName="main" probeResult="failure" output="Get \"https://10.132.0.34:8000/health\": dial tcp 10.132.0.34:8000: connect: connection refused" Apr 16 15:18:57.997468 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:18:57.997421 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-76b5d59b9fc5g2w" podUID="e7641919-79c9-46bf-a715-0969c96cecd9" containerName="main" probeResult="failure" output="Get \"https://10.132.0.35:8000/health\": dial tcp 10.132.0.35:8000: connect: connection refused" Apr 16 15:18:59.987885 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:18:59.987834 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-797f447c889vskf" podUID="380feacd-c563-4cf9-b4ae-3b2ad929ee2e" containerName="main" probeResult="failure" output="Get \"https://10.132.0.33:8001/health\": dial tcp 10.132.0.33:8001: connect: connection refused" Apr 16 15:19:00.056178 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:00.056149 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-582h49t" Apr 16 15:19:00.064818 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:00.064774 2579 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-582h49t" Apr 16 15:19:07.997449 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:07.997401 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-76b5d59b9fc5g2w" podUID="e7641919-79c9-46bf-a715-0969c96cecd9" containerName="main" probeResult="failure" output="Get \"https://10.132.0.35:8000/health\": dial tcp 10.132.0.35:8000: connect: connection refused" Apr 16 15:19:09.996563 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:09.996530 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-797f447c889vskf" Apr 16 15:19:10.013289 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:10.013262 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-797f447c889vskf" Apr 16 15:19:17.997280 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:17.997231 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-76b5d59b9fc5g2w" podUID="e7641919-79c9-46bf-a715-0969c96cecd9" containerName="main" probeResult="failure" output="Get \"https://10.132.0.35:8000/health\": dial tcp 10.132.0.35:8000: connect: connection refused" Apr 16 15:19:25.078429 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:25.078331 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-797f447c889vskf"] Apr 16 15:19:25.081040 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:25.080115 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-797f447c889vskf" 
podUID="380feacd-c563-4cf9-b4ae-3b2ad929ee2e" containerName="main" containerID="cri-o://1ec0ad0a3cfdc83bc77cd23e6f3f276a7b44fae7497b063a918f8639672b13b9" gracePeriod=30 Apr 16 15:19:25.082924 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:25.082860 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-582h49t"] Apr 16 15:19:25.083431 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:25.083381 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-582h49t" podUID="6957c6a7-2da6-4f49-891e-550249e065fa" containerName="main" containerID="cri-o://debabf1409fc048b6e5641fc240fc51af0dc31276d67cfc993e9e82779a7b6da" gracePeriod=30 Apr 16 15:19:28.007802 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:28.007770 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-76b5d59b9fc5g2w" Apr 16 15:19:28.015736 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:28.015704 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-76b5d59b9fc5g2w" Apr 16 15:19:36.686190 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:36.686155 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-ff85f4ccc-clxp4"] Apr 16 15:19:36.686661 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:36.686450 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5fab0ec6-79f0-4f5b-93e4-761651092e95" containerName="main" Apr 16 15:19:36.686661 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:36.686460 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fab0ec6-79f0-4f5b-93e4-761651092e95" containerName="main" Apr 16 15:19:36.686661 ip-10-0-139-47 kubenswrapper[2579]: I0416 
15:19:36.686474 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5fab0ec6-79f0-4f5b-93e4-761651092e95" containerName="storage-initializer" Apr 16 15:19:36.686661 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:36.686480 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fab0ec6-79f0-4f5b-93e4-761651092e95" containerName="storage-initializer" Apr 16 15:19:36.686661 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:36.686536 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="5fab0ec6-79f0-4f5b-93e4-761651092e95" containerName="main" Apr 16 15:19:36.689628 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:36.689613 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-ff85f4ccc-clxp4" Apr 16 15:19:36.692325 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:36.692304 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-kserve-dockercfg-s8426\"" Apr 16 15:19:36.692325 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:36.692319 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-kserve-self-signed-certs\"" Apr 16 15:19:36.700873 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:36.700846 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-ff85f4ccc-clxp4"] Apr 16 15:19:36.719540 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:36.719507 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/97e99309-af9c-4aa1-a1ff-c22f2488ceab-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-ff85f4ccc-clxp4\" (UID: \"97e99309-af9c-4aa1-a1ff-c22f2488ceab\") " 
pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-ff85f4ccc-clxp4" Apr 16 15:19:36.719719 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:36.719621 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/97e99309-af9c-4aa1-a1ff-c22f2488ceab-home\") pod \"custom-route-timeout-pd-test-kserve-ff85f4ccc-clxp4\" (UID: \"97e99309-af9c-4aa1-a1ff-c22f2488ceab\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-ff85f4ccc-clxp4" Apr 16 15:19:36.719719 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:36.719660 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/97e99309-af9c-4aa1-a1ff-c22f2488ceab-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-ff85f4ccc-clxp4\" (UID: \"97e99309-af9c-4aa1-a1ff-c22f2488ceab\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-ff85f4ccc-clxp4" Apr 16 15:19:36.719851 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:36.719726 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/97e99309-af9c-4aa1-a1ff-c22f2488ceab-dshm\") pod \"custom-route-timeout-pd-test-kserve-ff85f4ccc-clxp4\" (UID: \"97e99309-af9c-4aa1-a1ff-c22f2488ceab\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-ff85f4ccc-clxp4" Apr 16 15:19:36.719851 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:36.719760 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxrwt\" (UniqueName: \"kubernetes.io/projected/97e99309-af9c-4aa1-a1ff-c22f2488ceab-kube-api-access-gxrwt\") pod \"custom-route-timeout-pd-test-kserve-ff85f4ccc-clxp4\" (UID: \"97e99309-af9c-4aa1-a1ff-c22f2488ceab\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-ff85f4ccc-clxp4" Apr 16 15:19:36.719997 ip-10-0-139-47 
kubenswrapper[2579]: I0416 15:19:36.719869 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/97e99309-af9c-4aa1-a1ff-c22f2488ceab-model-cache\") pod \"custom-route-timeout-pd-test-kserve-ff85f4ccc-clxp4\" (UID: \"97e99309-af9c-4aa1-a1ff-c22f2488ceab\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-ff85f4ccc-clxp4" Apr 16 15:19:36.721565 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:36.721546 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7b8478db74-lffss"] Apr 16 15:19:36.724837 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:36.724817 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7b8478db74-lffss" Apr 16 15:19:36.726163 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:36.726146 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7b8478db74-lffss"] Apr 16 15:19:36.820393 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:36.820340 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/97e99309-af9c-4aa1-a1ff-c22f2488ceab-dshm\") pod \"custom-route-timeout-pd-test-kserve-ff85f4ccc-clxp4\" (UID: \"97e99309-af9c-4aa1-a1ff-c22f2488ceab\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-ff85f4ccc-clxp4" Apr 16 15:19:36.820583 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:36.820400 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gxrwt\" (UniqueName: \"kubernetes.io/projected/97e99309-af9c-4aa1-a1ff-c22f2488ceab-kube-api-access-gxrwt\") pod \"custom-route-timeout-pd-test-kserve-ff85f4ccc-clxp4\" (UID: \"97e99309-af9c-4aa1-a1ff-c22f2488ceab\") " 
pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-ff85f4ccc-clxp4" Apr 16 15:19:36.820583 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:36.820443 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/4d817844-eeaf-4e78-a00d-66aae2ac79af-dshm\") pod \"custom-route-timeout-pd-test-kserve-prefill-7b8478db74-lffss\" (UID: \"4d817844-eeaf-4e78-a00d-66aae2ac79af\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7b8478db74-lffss" Apr 16 15:19:36.820583 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:36.820476 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/97e99309-af9c-4aa1-a1ff-c22f2488ceab-model-cache\") pod \"custom-route-timeout-pd-test-kserve-ff85f4ccc-clxp4\" (UID: \"97e99309-af9c-4aa1-a1ff-c22f2488ceab\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-ff85f4ccc-clxp4" Apr 16 15:19:36.820583 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:36.820524 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/97e99309-af9c-4aa1-a1ff-c22f2488ceab-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-ff85f4ccc-clxp4\" (UID: \"97e99309-af9c-4aa1-a1ff-c22f2488ceab\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-ff85f4ccc-clxp4" Apr 16 15:19:36.820583 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:36.820552 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/4d817844-eeaf-4e78-a00d-66aae2ac79af-model-cache\") pod \"custom-route-timeout-pd-test-kserve-prefill-7b8478db74-lffss\" (UID: \"4d817844-eeaf-4e78-a00d-66aae2ac79af\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7b8478db74-lffss" Apr 16 
15:19:36.820583 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:36.820581 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4d817844-eeaf-4e78-a00d-66aae2ac79af-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-prefill-7b8478db74-lffss\" (UID: \"4d817844-eeaf-4e78-a00d-66aae2ac79af\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7b8478db74-lffss" Apr 16 15:19:36.820944 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:36.820624 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/97e99309-af9c-4aa1-a1ff-c22f2488ceab-home\") pod \"custom-route-timeout-pd-test-kserve-ff85f4ccc-clxp4\" (UID: \"97e99309-af9c-4aa1-a1ff-c22f2488ceab\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-ff85f4ccc-clxp4" Apr 16 15:19:36.820944 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:36.820649 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/97e99309-af9c-4aa1-a1ff-c22f2488ceab-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-ff85f4ccc-clxp4\" (UID: \"97e99309-af9c-4aa1-a1ff-c22f2488ceab\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-ff85f4ccc-clxp4" Apr 16 15:19:36.820944 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:36.820692 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4d817844-eeaf-4e78-a00d-66aae2ac79af-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-prefill-7b8478db74-lffss\" (UID: \"4d817844-eeaf-4e78-a00d-66aae2ac79af\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7b8478db74-lffss" Apr 16 15:19:36.820944 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:36.820722 2579 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/4d817844-eeaf-4e78-a00d-66aae2ac79af-home\") pod \"custom-route-timeout-pd-test-kserve-prefill-7b8478db74-lffss\" (UID: \"4d817844-eeaf-4e78-a00d-66aae2ac79af\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7b8478db74-lffss" Apr 16 15:19:36.820944 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:36.820746 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5mlh\" (UniqueName: \"kubernetes.io/projected/4d817844-eeaf-4e78-a00d-66aae2ac79af-kube-api-access-r5mlh\") pod \"custom-route-timeout-pd-test-kserve-prefill-7b8478db74-lffss\" (UID: \"4d817844-eeaf-4e78-a00d-66aae2ac79af\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7b8478db74-lffss" Apr 16 15:19:36.820944 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:36.820855 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/97e99309-af9c-4aa1-a1ff-c22f2488ceab-model-cache\") pod \"custom-route-timeout-pd-test-kserve-ff85f4ccc-clxp4\" (UID: \"97e99309-af9c-4aa1-a1ff-c22f2488ceab\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-ff85f4ccc-clxp4" Apr 16 15:19:36.820944 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:36.820917 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/97e99309-af9c-4aa1-a1ff-c22f2488ceab-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-ff85f4ccc-clxp4\" (UID: \"97e99309-af9c-4aa1-a1ff-c22f2488ceab\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-ff85f4ccc-clxp4" Apr 16 15:19:36.821182 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:36.820961 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" 
(UniqueName: \"kubernetes.io/empty-dir/97e99309-af9c-4aa1-a1ff-c22f2488ceab-home\") pod \"custom-route-timeout-pd-test-kserve-ff85f4ccc-clxp4\" (UID: \"97e99309-af9c-4aa1-a1ff-c22f2488ceab\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-ff85f4ccc-clxp4" Apr 16 15:19:36.822646 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:36.822616 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/97e99309-af9c-4aa1-a1ff-c22f2488ceab-dshm\") pod \"custom-route-timeout-pd-test-kserve-ff85f4ccc-clxp4\" (UID: \"97e99309-af9c-4aa1-a1ff-c22f2488ceab\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-ff85f4ccc-clxp4" Apr 16 15:19:36.823023 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:36.823006 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/97e99309-af9c-4aa1-a1ff-c22f2488ceab-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-ff85f4ccc-clxp4\" (UID: \"97e99309-af9c-4aa1-a1ff-c22f2488ceab\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-ff85f4ccc-clxp4" Apr 16 15:19:36.831219 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:36.831199 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxrwt\" (UniqueName: \"kubernetes.io/projected/97e99309-af9c-4aa1-a1ff-c22f2488ceab-kube-api-access-gxrwt\") pod \"custom-route-timeout-pd-test-kserve-ff85f4ccc-clxp4\" (UID: \"97e99309-af9c-4aa1-a1ff-c22f2488ceab\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-ff85f4ccc-clxp4" Apr 16 15:19:36.921928 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:36.921868 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/4d817844-eeaf-4e78-a00d-66aae2ac79af-dshm\") pod \"custom-route-timeout-pd-test-kserve-prefill-7b8478db74-lffss\" (UID: \"4d817844-eeaf-4e78-a00d-66aae2ac79af\") " 
pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7b8478db74-lffss" Apr 16 15:19:36.922175 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:36.921971 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/4d817844-eeaf-4e78-a00d-66aae2ac79af-model-cache\") pod \"custom-route-timeout-pd-test-kserve-prefill-7b8478db74-lffss\" (UID: \"4d817844-eeaf-4e78-a00d-66aae2ac79af\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7b8478db74-lffss" Apr 16 15:19:36.922175 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:36.922137 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4d817844-eeaf-4e78-a00d-66aae2ac79af-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-prefill-7b8478db74-lffss\" (UID: \"4d817844-eeaf-4e78-a00d-66aae2ac79af\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7b8478db74-lffss" Apr 16 15:19:36.922307 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:36.922243 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4d817844-eeaf-4e78-a00d-66aae2ac79af-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-prefill-7b8478db74-lffss\" (UID: \"4d817844-eeaf-4e78-a00d-66aae2ac79af\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7b8478db74-lffss" Apr 16 15:19:36.922307 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:36.922279 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/4d817844-eeaf-4e78-a00d-66aae2ac79af-home\") pod \"custom-route-timeout-pd-test-kserve-prefill-7b8478db74-lffss\" (UID: \"4d817844-eeaf-4e78-a00d-66aae2ac79af\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7b8478db74-lffss" Apr 16 
15:19:36.922402 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:36.922308 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r5mlh\" (UniqueName: \"kubernetes.io/projected/4d817844-eeaf-4e78-a00d-66aae2ac79af-kube-api-access-r5mlh\") pod \"custom-route-timeout-pd-test-kserve-prefill-7b8478db74-lffss\" (UID: \"4d817844-eeaf-4e78-a00d-66aae2ac79af\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7b8478db74-lffss" Apr 16 15:19:36.922402 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:36.922315 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/4d817844-eeaf-4e78-a00d-66aae2ac79af-model-cache\") pod \"custom-route-timeout-pd-test-kserve-prefill-7b8478db74-lffss\" (UID: \"4d817844-eeaf-4e78-a00d-66aae2ac79af\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7b8478db74-lffss" Apr 16 15:19:36.922567 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:36.922544 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4d817844-eeaf-4e78-a00d-66aae2ac79af-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-prefill-7b8478db74-lffss\" (UID: \"4d817844-eeaf-4e78-a00d-66aae2ac79af\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7b8478db74-lffss" Apr 16 15:19:36.922642 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:36.922623 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/4d817844-eeaf-4e78-a00d-66aae2ac79af-home\") pod \"custom-route-timeout-pd-test-kserve-prefill-7b8478db74-lffss\" (UID: \"4d817844-eeaf-4e78-a00d-66aae2ac79af\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7b8478db74-lffss" Apr 16 15:19:36.924703 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:36.924664 2579 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/4d817844-eeaf-4e78-a00d-66aae2ac79af-dshm\") pod \"custom-route-timeout-pd-test-kserve-prefill-7b8478db74-lffss\" (UID: \"4d817844-eeaf-4e78-a00d-66aae2ac79af\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7b8478db74-lffss" Apr 16 15:19:36.924854 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:36.924833 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4d817844-eeaf-4e78-a00d-66aae2ac79af-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-prefill-7b8478db74-lffss\" (UID: \"4d817844-eeaf-4e78-a00d-66aae2ac79af\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7b8478db74-lffss" Apr 16 15:19:36.937974 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:36.937889 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5mlh\" (UniqueName: \"kubernetes.io/projected/4d817844-eeaf-4e78-a00d-66aae2ac79af-kube-api-access-r5mlh\") pod \"custom-route-timeout-pd-test-kserve-prefill-7b8478db74-lffss\" (UID: \"4d817844-eeaf-4e78-a00d-66aae2ac79af\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7b8478db74-lffss" Apr 16 15:19:36.999072 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:36.999037 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-ff85f4ccc-clxp4" Apr 16 15:19:37.036768 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:37.036262 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7b8478db74-lffss" Apr 16 15:19:37.142848 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:37.142652 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-ff85f4ccc-clxp4"] Apr 16 15:19:37.145271 ip-10-0-139-47 kubenswrapper[2579]: W0416 15:19:37.145243 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97e99309_af9c_4aa1_a1ff_c22f2488ceab.slice/crio-f1f24bc2c2ddd892f399358ed53a3b9fb5a7a1bc563d2f7a0446bc3f514a4967 WatchSource:0}: Error finding container f1f24bc2c2ddd892f399358ed53a3b9fb5a7a1bc563d2f7a0446bc3f514a4967: Status 404 returned error can't find the container with id f1f24bc2c2ddd892f399358ed53a3b9fb5a7a1bc563d2f7a0446bc3f514a4967 Apr 16 15:19:37.147234 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:37.147218 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 15:19:37.179791 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:37.179769 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7b8478db74-lffss"] Apr 16 15:19:37.182288 ip-10-0-139-47 kubenswrapper[2579]: W0416 15:19:37.182264 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d817844_eeaf_4e78_a00d_66aae2ac79af.slice/crio-6fae360fb4e9e651a78a96bbac1416f3273c8950373e5de44e01115229bc8539 WatchSource:0}: Error finding container 6fae360fb4e9e651a78a96bbac1416f3273c8950373e5de44e01115229bc8539: Status 404 returned error can't find the container with id 6fae360fb4e9e651a78a96bbac1416f3273c8950373e5de44e01115229bc8539 Apr 16 15:19:37.597826 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:37.597784 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7b8478db74-lffss" event={"ID":"4d817844-eeaf-4e78-a00d-66aae2ac79af","Type":"ContainerStarted","Data":"a6bce07cd908b80e8029e5e20291fb6733cf57999cfe6aa9e79d40272ce36f9f"} Apr 16 15:19:37.597826 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:37.597832 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7b8478db74-lffss" event={"ID":"4d817844-eeaf-4e78-a00d-66aae2ac79af","Type":"ContainerStarted","Data":"6fae360fb4e9e651a78a96bbac1416f3273c8950373e5de44e01115229bc8539"} Apr 16 15:19:37.599350 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:37.599314 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-ff85f4ccc-clxp4" event={"ID":"97e99309-af9c-4aa1-a1ff-c22f2488ceab","Type":"ContainerStarted","Data":"906d9df5b7ffa2df1da9ecb4a1f97c2f09c14c078c63c5a40a227533c0b297cd"} Apr 16 15:19:37.599468 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:37.599353 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-ff85f4ccc-clxp4" event={"ID":"97e99309-af9c-4aa1-a1ff-c22f2488ceab","Type":"ContainerStarted","Data":"f1f24bc2c2ddd892f399358ed53a3b9fb5a7a1bc563d2f7a0446bc3f514a4967"} Apr 16 15:19:37.599468 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:37.599435 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-ff85f4ccc-clxp4" Apr 16 15:19:38.607154 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:38.607112 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-ff85f4ccc-clxp4" event={"ID":"97e99309-af9c-4aa1-a1ff-c22f2488ceab","Type":"ContainerStarted","Data":"96c416e887c2d89b369454576ab057736925a7fda4c99a647b53dbc541282dd2"} Apr 16 15:19:42.621775 ip-10-0-139-47 
kubenswrapper[2579]: I0416 15:19:42.621739 2579 generic.go:358] "Generic (PLEG): container finished" podID="4d817844-eeaf-4e78-a00d-66aae2ac79af" containerID="a6bce07cd908b80e8029e5e20291fb6733cf57999cfe6aa9e79d40272ce36f9f" exitCode=0 Apr 16 15:19:42.622298 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:42.621822 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7b8478db74-lffss" event={"ID":"4d817844-eeaf-4e78-a00d-66aae2ac79af","Type":"ContainerDied","Data":"a6bce07cd908b80e8029e5e20291fb6733cf57999cfe6aa9e79d40272ce36f9f"} Apr 16 15:19:42.623619 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:42.623591 2579 generic.go:358] "Generic (PLEG): container finished" podID="97e99309-af9c-4aa1-a1ff-c22f2488ceab" containerID="96c416e887c2d89b369454576ab057736925a7fda4c99a647b53dbc541282dd2" exitCode=0 Apr 16 15:19:42.623713 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:42.623661 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-ff85f4ccc-clxp4" event={"ID":"97e99309-af9c-4aa1-a1ff-c22f2488ceab","Type":"ContainerDied","Data":"96c416e887c2d89b369454576ab057736925a7fda4c99a647b53dbc541282dd2"} Apr 16 15:19:43.630687 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:43.630645 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7b8478db74-lffss" event={"ID":"4d817844-eeaf-4e78-a00d-66aae2ac79af","Type":"ContainerStarted","Data":"cab518df837157361024db348ed82dd7c7754176f4f2e47f650d8e5ca75df150"} Apr 16 15:19:43.637808 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:43.637771 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-ff85f4ccc-clxp4" event={"ID":"97e99309-af9c-4aa1-a1ff-c22f2488ceab","Type":"ContainerStarted","Data":"bb7b8dc8192f550dc6cc8bff7b14ce1cd7fc6d96482b17c1bb54f3ffae1672ba"} Apr 16 
15:19:43.652680 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:43.652605 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7b8478db74-lffss" podStartSLOduration=7.652584897 podStartE2EDuration="7.652584897s" podCreationTimestamp="2026-04-16 15:19:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:19:43.649774575 +0000 UTC m=+1632.832312377" watchObservedRunningTime="2026-04-16 15:19:43.652584897 +0000 UTC m=+1632.835122671" Apr 16 15:19:43.670871 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:43.670790 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-ff85f4ccc-clxp4" podStartSLOduration=7.670767741 podStartE2EDuration="7.670767741s" podCreationTimestamp="2026-04-16 15:19:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:19:43.669166783 +0000 UTC m=+1632.851704611" watchObservedRunningTime="2026-04-16 15:19:43.670767741 +0000 UTC m=+1632.853305527" Apr 16 15:19:46.999913 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:46.999857 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-ff85f4ccc-clxp4" Apr 16 15:19:47.000410 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:47.000016 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-ff85f4ccc-clxp4" Apr 16 15:19:47.001541 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:47.001503 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-ff85f4ccc-clxp4" podUID="97e99309-af9c-4aa1-a1ff-c22f2488ceab" containerName="main" 
probeResult="failure" output="Get \"https://10.132.0.36:8001/health\": dial tcp 10.132.0.36:8001: connect: connection refused"
Apr 16 15:19:47.013203 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:47.013176 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-ff85f4ccc-clxp4"
Apr 16 15:19:47.036623 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:47.036588 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7b8478db74-lffss"
Apr 16 15:19:47.036623 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:47.036630 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7b8478db74-lffss"
Apr 16 15:19:47.040879 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:47.040835 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7b8478db74-lffss" podUID="4d817844-eeaf-4e78-a00d-66aae2ac79af" containerName="main" probeResult="failure" output="Get \"https://10.132.0.37:8000/health\": dial tcp 10.132.0.37:8000: connect: connection refused"
Apr 16 15:19:55.079780 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:55.079676 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-797f447c889vskf" podUID="380feacd-c563-4cf9-b4ae-3b2ad929ee2e" containerName="llm-d-routing-sidecar" containerID="cri-o://d936422a42c03455cb7a897b4a1d608d1256633a1c62dc859cfe51807231dd8c" gracePeriod=2
Apr 16 15:19:55.548348 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:55.548324 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-797f447c889vskf_380feacd-c563-4cf9-b4ae-3b2ad929ee2e/main/0.log"
Apr 16 15:19:55.549192 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:55.549169 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-797f447c889vskf"
Apr 16 15:19:55.555164 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:55.555142 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-582h49t"
Apr 16 15:19:55.601396 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:55.601324 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/380feacd-c563-4cf9-b4ae-3b2ad929ee2e-tls-certs\") pod \"380feacd-c563-4cf9-b4ae-3b2ad929ee2e\" (UID: \"380feacd-c563-4cf9-b4ae-3b2ad929ee2e\") "
Apr 16 15:19:55.601396 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:55.601362 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f8n7r\" (UniqueName: \"kubernetes.io/projected/380feacd-c563-4cf9-b4ae-3b2ad929ee2e-kube-api-access-f8n7r\") pod \"380feacd-c563-4cf9-b4ae-3b2ad929ee2e\" (UID: \"380feacd-c563-4cf9-b4ae-3b2ad929ee2e\") "
Apr 16 15:19:55.601396 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:55.601391 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zd6rz\" (UniqueName: \"kubernetes.io/projected/6957c6a7-2da6-4f49-891e-550249e065fa-kube-api-access-zd6rz\") pod \"6957c6a7-2da6-4f49-891e-550249e065fa\" (UID: \"6957c6a7-2da6-4f49-891e-550249e065fa\") "
Apr 16 15:19:55.601645 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:55.601621 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6957c6a7-2da6-4f49-891e-550249e065fa-kserve-provision-location\") pod \"6957c6a7-2da6-4f49-891e-550249e065fa\" (UID: \"6957c6a7-2da6-4f49-891e-550249e065fa\") "
Apr 16 15:19:55.601784 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:55.601766 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/380feacd-c563-4cf9-b4ae-3b2ad929ee2e-dshm\") pod \"380feacd-c563-4cf9-b4ae-3b2ad929ee2e\" (UID: \"380feacd-c563-4cf9-b4ae-3b2ad929ee2e\") "
Apr 16 15:19:55.602281 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:55.602260 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6957c6a7-2da6-4f49-891e-550249e065fa-dshm\") pod \"6957c6a7-2da6-4f49-891e-550249e065fa\" (UID: \"6957c6a7-2da6-4f49-891e-550249e065fa\") "
Apr 16 15:19:55.602382 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:55.602291 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6957c6a7-2da6-4f49-891e-550249e065fa-home\") pod \"6957c6a7-2da6-4f49-891e-550249e065fa\" (UID: \"6957c6a7-2da6-4f49-891e-550249e065fa\") "
Apr 16 15:19:55.602382 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:55.602343 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/380feacd-c563-4cf9-b4ae-3b2ad929ee2e-home\") pod \"380feacd-c563-4cf9-b4ae-3b2ad929ee2e\" (UID: \"380feacd-c563-4cf9-b4ae-3b2ad929ee2e\") "
Apr 16 15:19:55.602382 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:55.602373 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6957c6a7-2da6-4f49-891e-550249e065fa-tls-certs\") pod \"6957c6a7-2da6-4f49-891e-550249e065fa\" (UID: \"6957c6a7-2da6-4f49-891e-550249e065fa\") "
Apr 16 15:19:55.602542 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:55.602411 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/380feacd-c563-4cf9-b4ae-3b2ad929ee2e-model-cache\") pod \"380feacd-c563-4cf9-b4ae-3b2ad929ee2e\" (UID: \"380feacd-c563-4cf9-b4ae-3b2ad929ee2e\") "
Apr 16 15:19:55.602542 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:55.602436 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6957c6a7-2da6-4f49-891e-550249e065fa-model-cache\") pod \"6957c6a7-2da6-4f49-891e-550249e065fa\" (UID: \"6957c6a7-2da6-4f49-891e-550249e065fa\") "
Apr 16 15:19:55.602542 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:55.602461 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/380feacd-c563-4cf9-b4ae-3b2ad929ee2e-kserve-provision-location\") pod \"380feacd-c563-4cf9-b4ae-3b2ad929ee2e\" (UID: \"380feacd-c563-4cf9-b4ae-3b2ad929ee2e\") "
Apr 16 15:19:55.602761 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:55.602736 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6957c6a7-2da6-4f49-891e-550249e065fa-home" (OuterVolumeSpecName: "home") pod "6957c6a7-2da6-4f49-891e-550249e065fa" (UID: "6957c6a7-2da6-4f49-891e-550249e065fa"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 15:19:55.606727 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:55.606522 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/380feacd-c563-4cf9-b4ae-3b2ad929ee2e-kube-api-access-f8n7r" (OuterVolumeSpecName: "kube-api-access-f8n7r") pod "380feacd-c563-4cf9-b4ae-3b2ad929ee2e" (UID: "380feacd-c563-4cf9-b4ae-3b2ad929ee2e"). InnerVolumeSpecName "kube-api-access-f8n7r". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 15:19:55.606834 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:55.606754 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/380feacd-c563-4cf9-b4ae-3b2ad929ee2e-model-cache" (OuterVolumeSpecName: "model-cache") pod "380feacd-c563-4cf9-b4ae-3b2ad929ee2e" (UID: "380feacd-c563-4cf9-b4ae-3b2ad929ee2e"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 15:19:55.607196 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:55.606958 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6957c6a7-2da6-4f49-891e-550249e065fa-model-cache" (OuterVolumeSpecName: "model-cache") pod "6957c6a7-2da6-4f49-891e-550249e065fa" (UID: "6957c6a7-2da6-4f49-891e-550249e065fa"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 15:19:55.607512 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:55.607331 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/380feacd-c563-4cf9-b4ae-3b2ad929ee2e-home" (OuterVolumeSpecName: "home") pod "380feacd-c563-4cf9-b4ae-3b2ad929ee2e" (UID: "380feacd-c563-4cf9-b4ae-3b2ad929ee2e"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 15:19:55.607512 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:55.607475 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6957c6a7-2da6-4f49-891e-550249e065fa-kube-api-access-zd6rz" (OuterVolumeSpecName: "kube-api-access-zd6rz") pod "6957c6a7-2da6-4f49-891e-550249e065fa" (UID: "6957c6a7-2da6-4f49-891e-550249e065fa"). InnerVolumeSpecName "kube-api-access-zd6rz". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 15:19:55.607751 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:55.607721 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/380feacd-c563-4cf9-b4ae-3b2ad929ee2e-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "380feacd-c563-4cf9-b4ae-3b2ad929ee2e" (UID: "380feacd-c563-4cf9-b4ae-3b2ad929ee2e"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 15:19:55.607967 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:55.607927 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/380feacd-c563-4cf9-b4ae-3b2ad929ee2e-dshm" (OuterVolumeSpecName: "dshm") pod "380feacd-c563-4cf9-b4ae-3b2ad929ee2e" (UID: "380feacd-c563-4cf9-b4ae-3b2ad929ee2e"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 15:19:55.608236 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:55.608209 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6957c6a7-2da6-4f49-891e-550249e065fa-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "6957c6a7-2da6-4f49-891e-550249e065fa" (UID: "6957c6a7-2da6-4f49-891e-550249e065fa"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 15:19:55.608756 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:55.608727 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6957c6a7-2da6-4f49-891e-550249e065fa-dshm" (OuterVolumeSpecName: "dshm") pod "6957c6a7-2da6-4f49-891e-550249e065fa" (UID: "6957c6a7-2da6-4f49-891e-550249e065fa"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 15:19:55.661146 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:55.661098 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/380feacd-c563-4cf9-b4ae-3b2ad929ee2e-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "380feacd-c563-4cf9-b4ae-3b2ad929ee2e" (UID: "380feacd-c563-4cf9-b4ae-3b2ad929ee2e"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 15:19:55.683962 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:55.683923 2579 generic.go:358] "Generic (PLEG): container finished" podID="6957c6a7-2da6-4f49-891e-550249e065fa" containerID="debabf1409fc048b6e5641fc240fc51af0dc31276d67cfc993e9e82779a7b6da" exitCode=137
Apr 16 15:19:55.684145 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:55.684007 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-582h49t"
Apr 16 15:19:55.684145 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:55.683998 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-582h49t" event={"ID":"6957c6a7-2da6-4f49-891e-550249e065fa","Type":"ContainerDied","Data":"debabf1409fc048b6e5641fc240fc51af0dc31276d67cfc993e9e82779a7b6da"}
Apr 16 15:19:55.684145 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:55.684130 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-582h49t" event={"ID":"6957c6a7-2da6-4f49-891e-550249e065fa","Type":"ContainerDied","Data":"c4fbfe54bfc1e113cebdae0ed2ccd19a3d3d9188f6dadee4833d38414ff46318"}
Apr 16 15:19:55.684294 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:55.684155 2579 scope.go:117] "RemoveContainer" containerID="debabf1409fc048b6e5641fc240fc51af0dc31276d67cfc993e9e82779a7b6da"
Apr 16 15:19:55.685701 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:55.685676 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-797f447c889vskf_380feacd-c563-4cf9-b4ae-3b2ad929ee2e/main/0.log"
Apr 16 15:19:55.686432 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:55.686407 2579 generic.go:358] "Generic (PLEG): container finished" podID="380feacd-c563-4cf9-b4ae-3b2ad929ee2e" containerID="1ec0ad0a3cfdc83bc77cd23e6f3f276a7b44fae7497b063a918f8639672b13b9" exitCode=137
Apr 16 15:19:55.686432 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:55.686430 2579 generic.go:358] "Generic (PLEG): container finished" podID="380feacd-c563-4cf9-b4ae-3b2ad929ee2e" containerID="d936422a42c03455cb7a897b4a1d608d1256633a1c62dc859cfe51807231dd8c" exitCode=0
Apr 16 15:19:55.686612 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:55.686468 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-797f447c889vskf" event={"ID":"380feacd-c563-4cf9-b4ae-3b2ad929ee2e","Type":"ContainerDied","Data":"1ec0ad0a3cfdc83bc77cd23e6f3f276a7b44fae7497b063a918f8639672b13b9"}
Apr 16 15:19:55.686612 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:55.686491 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-797f447c889vskf" event={"ID":"380feacd-c563-4cf9-b4ae-3b2ad929ee2e","Type":"ContainerDied","Data":"d936422a42c03455cb7a897b4a1d608d1256633a1c62dc859cfe51807231dd8c"}
Apr 16 15:19:55.686612 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:55.686505 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-797f447c889vskf" event={"ID":"380feacd-c563-4cf9-b4ae-3b2ad929ee2e","Type":"ContainerDied","Data":"57113a0cf9d99250060dc67f2c80395cca0f591591ae3d36e53f8b364572a911"}
Apr 16 15:19:55.686612 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:55.686598 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-797f447c889vskf"
Apr 16 15:19:55.690294 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:55.690262 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6957c6a7-2da6-4f49-891e-550249e065fa-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "6957c6a7-2da6-4f49-891e-550249e065fa" (UID: "6957c6a7-2da6-4f49-891e-550249e065fa"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 15:19:55.703701 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:55.703676 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-f8n7r\" (UniqueName: \"kubernetes.io/projected/380feacd-c563-4cf9-b4ae-3b2ad929ee2e-kube-api-access-f8n7r\") on node \"ip-10-0-139-47.ec2.internal\" DevicePath \"\""
Apr 16 15:19:55.703701 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:55.703702 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zd6rz\" (UniqueName: \"kubernetes.io/projected/6957c6a7-2da6-4f49-891e-550249e065fa-kube-api-access-zd6rz\") on node \"ip-10-0-139-47.ec2.internal\" DevicePath \"\""
Apr 16 15:19:55.703943 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:55.703715 2579 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6957c6a7-2da6-4f49-891e-550249e065fa-kserve-provision-location\") on node \"ip-10-0-139-47.ec2.internal\" DevicePath \"\""
Apr 16 15:19:55.703943 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:55.703728 2579 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/380feacd-c563-4cf9-b4ae-3b2ad929ee2e-dshm\") on node \"ip-10-0-139-47.ec2.internal\" DevicePath \"\""
Apr 16 15:19:55.703943 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:55.703741 2579 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6957c6a7-2da6-4f49-891e-550249e065fa-dshm\") on node \"ip-10-0-139-47.ec2.internal\" DevicePath \"\""
Apr 16 15:19:55.703943 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:55.703752 2579 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6957c6a7-2da6-4f49-891e-550249e065fa-home\") on node \"ip-10-0-139-47.ec2.internal\" DevicePath \"\""
Apr 16 15:19:55.703943 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:55.703763 2579 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/380feacd-c563-4cf9-b4ae-3b2ad929ee2e-home\") on node \"ip-10-0-139-47.ec2.internal\" DevicePath \"\""
Apr 16 15:19:55.703943 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:55.703774 2579 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6957c6a7-2da6-4f49-891e-550249e065fa-tls-certs\") on node \"ip-10-0-139-47.ec2.internal\" DevicePath \"\""
Apr 16 15:19:55.703943 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:55.703787 2579 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/380feacd-c563-4cf9-b4ae-3b2ad929ee2e-model-cache\") on node \"ip-10-0-139-47.ec2.internal\" DevicePath \"\""
Apr 16 15:19:55.703943 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:55.703799 2579 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6957c6a7-2da6-4f49-891e-550249e065fa-model-cache\") on node \"ip-10-0-139-47.ec2.internal\" DevicePath \"\""
Apr 16 15:19:55.703943 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:55.703811 2579 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/380feacd-c563-4cf9-b4ae-3b2ad929ee2e-kserve-provision-location\") on node \"ip-10-0-139-47.ec2.internal\" DevicePath \"\""
Apr 16 15:19:55.703943 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:55.703823 2579 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/380feacd-c563-4cf9-b4ae-3b2ad929ee2e-tls-certs\") on node \"ip-10-0-139-47.ec2.internal\" DevicePath \"\""
Apr 16 15:19:55.704760 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:55.704745 2579 scope.go:117] "RemoveContainer" containerID="465ce9bfa2a8b72af78f2d6b6d888302e94ba8aa5e110a1a074202064918020e"
Apr 16 15:19:55.710633 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:55.710604 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-797f447c889vskf"]
Apr 16 15:19:55.713547 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:55.713524 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-797f447c889vskf"]
Apr 16 15:19:55.775431 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:55.775407 2579 scope.go:117] "RemoveContainer" containerID="debabf1409fc048b6e5641fc240fc51af0dc31276d67cfc993e9e82779a7b6da"
Apr 16 15:19:55.775836 ip-10-0-139-47 kubenswrapper[2579]: E0416 15:19:55.775808 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"debabf1409fc048b6e5641fc240fc51af0dc31276d67cfc993e9e82779a7b6da\": container with ID starting with debabf1409fc048b6e5641fc240fc51af0dc31276d67cfc993e9e82779a7b6da not found: ID does not exist" containerID="debabf1409fc048b6e5641fc240fc51af0dc31276d67cfc993e9e82779a7b6da"
Apr 16 15:19:55.776001 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:55.775852 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"debabf1409fc048b6e5641fc240fc51af0dc31276d67cfc993e9e82779a7b6da"} err="failed to get container status \"debabf1409fc048b6e5641fc240fc51af0dc31276d67cfc993e9e82779a7b6da\": rpc error: code = NotFound desc = could not find container \"debabf1409fc048b6e5641fc240fc51af0dc31276d67cfc993e9e82779a7b6da\": container with ID starting with debabf1409fc048b6e5641fc240fc51af0dc31276d67cfc993e9e82779a7b6da not found: ID does not exist"
Apr 16 15:19:55.776001 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:55.775884 2579 scope.go:117] "RemoveContainer" containerID="465ce9bfa2a8b72af78f2d6b6d888302e94ba8aa5e110a1a074202064918020e"
Apr 16 15:19:55.776265 ip-10-0-139-47 kubenswrapper[2579]: E0416 15:19:55.776237 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"465ce9bfa2a8b72af78f2d6b6d888302e94ba8aa5e110a1a074202064918020e\": container with ID starting with 465ce9bfa2a8b72af78f2d6b6d888302e94ba8aa5e110a1a074202064918020e not found: ID does not exist" containerID="465ce9bfa2a8b72af78f2d6b6d888302e94ba8aa5e110a1a074202064918020e"
Apr 16 15:19:55.776315 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:55.776273 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"465ce9bfa2a8b72af78f2d6b6d888302e94ba8aa5e110a1a074202064918020e"} err="failed to get container status \"465ce9bfa2a8b72af78f2d6b6d888302e94ba8aa5e110a1a074202064918020e\": rpc error: code = NotFound desc = could not find container \"465ce9bfa2a8b72af78f2d6b6d888302e94ba8aa5e110a1a074202064918020e\": container with ID starting with 465ce9bfa2a8b72af78f2d6b6d888302e94ba8aa5e110a1a074202064918020e not found: ID does not exist"
Apr 16 15:19:55.776315 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:55.776297 2579 scope.go:117] "RemoveContainer" containerID="1ec0ad0a3cfdc83bc77cd23e6f3f276a7b44fae7497b063a918f8639672b13b9"
Apr 16 15:19:55.798103 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:55.798084 2579 scope.go:117] "RemoveContainer" containerID="adfff3fa81fb927b05e5689ba3583551fb767dd59de99eb36905928a3d347609"
Apr 16 15:19:55.842608 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:55.842586 2579 scope.go:117] "RemoveContainer" containerID="d936422a42c03455cb7a897b4a1d608d1256633a1c62dc859cfe51807231dd8c"
Apr 16 15:19:55.850765 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:55.850739 2579 scope.go:117] "RemoveContainer" containerID="1ec0ad0a3cfdc83bc77cd23e6f3f276a7b44fae7497b063a918f8639672b13b9"
Apr 16 15:19:55.851067 ip-10-0-139-47 kubenswrapper[2579]: E0416 15:19:55.851038 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ec0ad0a3cfdc83bc77cd23e6f3f276a7b44fae7497b063a918f8639672b13b9\": container with ID starting with 1ec0ad0a3cfdc83bc77cd23e6f3f276a7b44fae7497b063a918f8639672b13b9 not found: ID does not exist" containerID="1ec0ad0a3cfdc83bc77cd23e6f3f276a7b44fae7497b063a918f8639672b13b9"
Apr 16 15:19:55.851165 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:55.851078 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ec0ad0a3cfdc83bc77cd23e6f3f276a7b44fae7497b063a918f8639672b13b9"} err="failed to get container status \"1ec0ad0a3cfdc83bc77cd23e6f3f276a7b44fae7497b063a918f8639672b13b9\": rpc error: code = NotFound desc = could not find container \"1ec0ad0a3cfdc83bc77cd23e6f3f276a7b44fae7497b063a918f8639672b13b9\": container with ID starting with 1ec0ad0a3cfdc83bc77cd23e6f3f276a7b44fae7497b063a918f8639672b13b9 not found: ID does not exist"
Apr 16 15:19:55.851165 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:55.851108 2579 scope.go:117] "RemoveContainer" containerID="adfff3fa81fb927b05e5689ba3583551fb767dd59de99eb36905928a3d347609"
Apr 16 15:19:55.851430 ip-10-0-139-47 kubenswrapper[2579]: E0416 15:19:55.851382 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"adfff3fa81fb927b05e5689ba3583551fb767dd59de99eb36905928a3d347609\": container with ID starting with adfff3fa81fb927b05e5689ba3583551fb767dd59de99eb36905928a3d347609 not found: ID does not exist" containerID="adfff3fa81fb927b05e5689ba3583551fb767dd59de99eb36905928a3d347609"
Apr 16 15:19:55.851430 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:55.851411 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"adfff3fa81fb927b05e5689ba3583551fb767dd59de99eb36905928a3d347609"} err="failed to get container status \"adfff3fa81fb927b05e5689ba3583551fb767dd59de99eb36905928a3d347609\": rpc error: code = NotFound desc = could not find container \"adfff3fa81fb927b05e5689ba3583551fb767dd59de99eb36905928a3d347609\": container with ID starting with adfff3fa81fb927b05e5689ba3583551fb767dd59de99eb36905928a3d347609 not found: ID does not exist"
Apr 16 15:19:55.851430 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:55.851428 2579 scope.go:117] "RemoveContainer" containerID="d936422a42c03455cb7a897b4a1d608d1256633a1c62dc859cfe51807231dd8c"
Apr 16 15:19:55.851668 ip-10-0-139-47 kubenswrapper[2579]: E0416 15:19:55.851649 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d936422a42c03455cb7a897b4a1d608d1256633a1c62dc859cfe51807231dd8c\": container with ID starting with d936422a42c03455cb7a897b4a1d608d1256633a1c62dc859cfe51807231dd8c not found: ID does not exist" containerID="d936422a42c03455cb7a897b4a1d608d1256633a1c62dc859cfe51807231dd8c"
Apr 16 15:19:55.851731 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:55.851672 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d936422a42c03455cb7a897b4a1d608d1256633a1c62dc859cfe51807231dd8c"} err="failed to get container status \"d936422a42c03455cb7a897b4a1d608d1256633a1c62dc859cfe51807231dd8c\": rpc error: code = NotFound desc = could not find container \"d936422a42c03455cb7a897b4a1d608d1256633a1c62dc859cfe51807231dd8c\": container with ID starting with d936422a42c03455cb7a897b4a1d608d1256633a1c62dc859cfe51807231dd8c not found: ID does not exist"
Apr 16 15:19:55.851731 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:55.851689 2579 scope.go:117] "RemoveContainer" containerID="1ec0ad0a3cfdc83bc77cd23e6f3f276a7b44fae7497b063a918f8639672b13b9"
Apr 16 15:19:55.851988 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:55.851965 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ec0ad0a3cfdc83bc77cd23e6f3f276a7b44fae7497b063a918f8639672b13b9"} err="failed to get container status \"1ec0ad0a3cfdc83bc77cd23e6f3f276a7b44fae7497b063a918f8639672b13b9\": rpc error: code = NotFound desc = could not find container \"1ec0ad0a3cfdc83bc77cd23e6f3f276a7b44fae7497b063a918f8639672b13b9\": container with ID starting with 1ec0ad0a3cfdc83bc77cd23e6f3f276a7b44fae7497b063a918f8639672b13b9 not found: ID does not exist"
Apr 16 15:19:55.851988 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:55.851988 2579 scope.go:117] "RemoveContainer" containerID="adfff3fa81fb927b05e5689ba3583551fb767dd59de99eb36905928a3d347609"
Apr 16 15:19:55.852243 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:55.852216 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"adfff3fa81fb927b05e5689ba3583551fb767dd59de99eb36905928a3d347609"} err="failed to get container status \"adfff3fa81fb927b05e5689ba3583551fb767dd59de99eb36905928a3d347609\": rpc error: code = NotFound desc = could not find container \"adfff3fa81fb927b05e5689ba3583551fb767dd59de99eb36905928a3d347609\": container with ID starting with adfff3fa81fb927b05e5689ba3583551fb767dd59de99eb36905928a3d347609 not found: ID does not exist"
Apr 16 15:19:55.852302 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:55.852246 2579 scope.go:117] "RemoveContainer" containerID="d936422a42c03455cb7a897b4a1d608d1256633a1c62dc859cfe51807231dd8c"
Apr 16 15:19:55.852482 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:55.852460 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d936422a42c03455cb7a897b4a1d608d1256633a1c62dc859cfe51807231dd8c"} err="failed to get container status \"d936422a42c03455cb7a897b4a1d608d1256633a1c62dc859cfe51807231dd8c\": rpc error: code = NotFound desc = could not find container \"d936422a42c03455cb7a897b4a1d608d1256633a1c62dc859cfe51807231dd8c\": container with ID starting with d936422a42c03455cb7a897b4a1d608d1256633a1c62dc859cfe51807231dd8c not found: ID does not exist"
Apr 16 15:19:56.006056 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:56.006020 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-582h49t"]
Apr 16 15:19:56.010728 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:56.010699 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-582h49t"]
Apr 16 15:19:56.999973 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:56.999924 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-ff85f4ccc-clxp4" podUID="97e99309-af9c-4aa1-a1ff-c22f2488ceab" containerName="main" probeResult="failure" output="Get \"https://10.132.0.36:8001/health\": dial tcp 10.132.0.36:8001: connect: connection refused"
Apr 16 15:19:57.037133 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:57.037081 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7b8478db74-lffss" podUID="4d817844-eeaf-4e78-a00d-66aae2ac79af" containerName="main" probeResult="failure" output="Get \"https://10.132.0.37:8000/health\": dial tcp 10.132.0.37:8000: connect: connection refused"
Apr 16 15:19:57.355403 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:57.355302 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="380feacd-c563-4cf9-b4ae-3b2ad929ee2e" path="/var/lib/kubelet/pods/380feacd-c563-4cf9-b4ae-3b2ad929ee2e/volumes"
Apr 16 15:19:57.356050 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:19:57.356029 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6957c6a7-2da6-4f49-891e-550249e065fa" path="/var/lib/kubelet/pods/6957c6a7-2da6-4f49-891e-550249e065fa/volumes"
Apr 16 15:20:03.704929 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:20:03.704881 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-76b5d59b9fc5g2w"]
Apr 16 15:20:03.706162 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:20:03.706128 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-76b5d59b9fc5g2w" podUID="e7641919-79c9-46bf-a715-0969c96cecd9" containerName="main" containerID="cri-o://f379c8f298735fcf31f5522cc3bd6c6f17b3817297096724e4582ef81ce00af5" gracePeriod=30
Apr 16 15:20:07.000154 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:20:07.000078 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-ff85f4ccc-clxp4" podUID="97e99309-af9c-4aa1-a1ff-c22f2488ceab" containerName="main" probeResult="failure" output="Get \"https://10.132.0.36:8001/health\": dial tcp 10.132.0.36:8001: connect: connection refused"
Apr 16 15:20:07.036942 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:20:07.036874 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7b8478db74-lffss" podUID="4d817844-eeaf-4e78-a00d-66aae2ac79af" containerName="main" probeResult="failure" output="Get \"https://10.132.0.37:8000/health\": dial tcp 10.132.0.37:8000: connect: connection refused"
Apr 16 15:20:17.000376 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:20:17.000321 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-ff85f4ccc-clxp4" podUID="97e99309-af9c-4aa1-a1ff-c22f2488ceab" containerName="main" probeResult="failure" output="Get \"https://10.132.0.36:8001/health\": dial tcp 10.132.0.36:8001: connect: connection refused"
Apr 16 15:20:17.037451 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:20:17.037401 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7b8478db74-lffss" podUID="4d817844-eeaf-4e78-a00d-66aae2ac79af" containerName="main" probeResult="failure" output="Get \"https://10.132.0.37:8000/health\": dial tcp 10.132.0.37:8000: connect: connection refused"
Apr 16 15:20:26.999959 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:20:26.999893 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-ff85f4ccc-clxp4" podUID="97e99309-af9c-4aa1-a1ff-c22f2488ceab" containerName="main" probeResult="failure" output="Get \"https://10.132.0.36:8001/health\": dial tcp 10.132.0.36:8001: connect: connection refused"
Apr 16 15:20:27.037482 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:20:27.037431 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7b8478db74-lffss" podUID="4d817844-eeaf-4e78-a00d-66aae2ac79af" containerName="main" probeResult="failure" output="Get \"https://10.132.0.37:8000/health\": dial tcp 10.132.0.37:8000: connect: connection refused"
Apr 16 15:20:27.107331 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:20:27.107296 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"]
Apr 16 15:20:27.107634 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:20:27.107621 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="380feacd-c563-4cf9-b4ae-3b2ad929ee2e" containerName="storage-initializer"
Apr 16 15:20:27.107682 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:20:27.107639 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="380feacd-c563-4cf9-b4ae-3b2ad929ee2e" containerName="storage-initializer"
Apr 16 15:20:27.107682 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:20:27.107652 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6957c6a7-2da6-4f49-891e-550249e065fa" containerName="main"
Apr 16 15:20:27.107682 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:20:27.107658 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="6957c6a7-2da6-4f49-891e-550249e065fa" containerName="main"
Apr 16 15:20:27.107682 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:20:27.107669 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6957c6a7-2da6-4f49-891e-550249e065fa" containerName="storage-initializer"
Apr 16 15:20:27.107682 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:20:27.107676 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="6957c6a7-2da6-4f49-891e-550249e065fa" containerName="storage-initializer"
Apr 16 15:20:27.107845 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:20:27.107691 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="380feacd-c563-4cf9-b4ae-3b2ad929ee2e" containerName="llm-d-routing-sidecar"
Apr 16 15:20:27.107845 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:20:27.107696 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="380feacd-c563-4cf9-b4ae-3b2ad929ee2e" containerName="llm-d-routing-sidecar"
Apr 16 15:20:27.107845 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:20:27.107705 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="380feacd-c563-4cf9-b4ae-3b2ad929ee2e" containerName="main"
Apr 16 15:20:27.107845 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:20:27.107710 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="380feacd-c563-4cf9-b4ae-3b2ad929ee2e" containerName="main"
Apr 16 15:20:27.107845 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:20:27.107761 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="380feacd-c563-4cf9-b4ae-3b2ad929ee2e" containerName="llm-d-routing-sidecar"
Apr 16 15:20:27.107845 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:20:27.107770 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="6957c6a7-2da6-4f49-891e-550249e065fa" containerName="main"
Apr 16 15:20:27.107845 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:20:27.107777 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="380feacd-c563-4cf9-b4ae-3b2ad929ee2e" containerName="main"
Apr 16 15:20:27.112545 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:20:27.112515 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 16 15:20:27.115110 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:20:27.115074 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs\""
Apr 16 15:20:27.115859 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:20:27.115832 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-f312f5-cb7fb8cf-dockercfg-75nnz\""
Apr 16 15:20:27.125318 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:20:27.125293 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"]
Apr 16 15:20:27.192784 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:20:27.192743 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1894e0ef-e31e-46f5-9135-e94d5ab6a0cd-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"1894e0ef-e31e-46f5-9135-e94d5ab6a0cd\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 16 15:20:27.192784 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:20:27.192782 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1894e0ef-e31e-46f5-9135-e94d5ab6a0cd-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"1894e0ef-e31e-46f5-9135-e94d5ab6a0cd\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 16 15:20:27.193060 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:20:27.192827 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1894e0ef-e31e-46f5-9135-e94d5ab6a0cd-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"1894e0ef-e31e-46f5-9135-e94d5ab6a0cd\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 16 15:20:27.193060 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:20:27.192862 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1894e0ef-e31e-46f5-9135-e94d5ab6a0cd-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"1894e0ef-e31e-46f5-9135-e94d5ab6a0cd\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 16 15:20:27.193060 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:20:27.192882 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f76hh\" (UniqueName:
\"kubernetes.io/projected/1894e0ef-e31e-46f5-9135-e94d5ab6a0cd-kube-api-access-f76hh\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"1894e0ef-e31e-46f5-9135-e94d5ab6a0cd\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 15:20:27.193060 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:20:27.192952 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1894e0ef-e31e-46f5-9135-e94d5ab6a0cd-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"1894e0ef-e31e-46f5-9135-e94d5ab6a0cd\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 15:20:27.294495 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:20:27.294399 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1894e0ef-e31e-46f5-9135-e94d5ab6a0cd-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"1894e0ef-e31e-46f5-9135-e94d5ab6a0cd\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 15:20:27.294697 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:20:27.294615 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1894e0ef-e31e-46f5-9135-e94d5ab6a0cd-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"1894e0ef-e31e-46f5-9135-e94d5ab6a0cd\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 15:20:27.294697 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:20:27.294680 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f76hh\" (UniqueName: \"kubernetes.io/projected/1894e0ef-e31e-46f5-9135-e94d5ab6a0cd-kube-api-access-f76hh\") pod 
\"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"1894e0ef-e31e-46f5-9135-e94d5ab6a0cd\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 15:20:27.294824 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:20:27.294720 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1894e0ef-e31e-46f5-9135-e94d5ab6a0cd-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"1894e0ef-e31e-46f5-9135-e94d5ab6a0cd\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 15:20:27.294824 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:20:27.294801 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1894e0ef-e31e-46f5-9135-e94d5ab6a0cd-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"1894e0ef-e31e-46f5-9135-e94d5ab6a0cd\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 15:20:27.294959 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:20:27.294839 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1894e0ef-e31e-46f5-9135-e94d5ab6a0cd-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"1894e0ef-e31e-46f5-9135-e94d5ab6a0cd\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 15:20:27.295177 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:20:27.295140 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1894e0ef-e31e-46f5-9135-e94d5ab6a0cd-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"1894e0ef-e31e-46f5-9135-e94d5ab6a0cd\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 15:20:27.295343 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:20:27.295303 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1894e0ef-e31e-46f5-9135-e94d5ab6a0cd-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"1894e0ef-e31e-46f5-9135-e94d5ab6a0cd\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 15:20:27.295547 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:20:27.295497 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1894e0ef-e31e-46f5-9135-e94d5ab6a0cd-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"1894e0ef-e31e-46f5-9135-e94d5ab6a0cd\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 15:20:27.297195 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:20:27.297163 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1894e0ef-e31e-46f5-9135-e94d5ab6a0cd-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"1894e0ef-e31e-46f5-9135-e94d5ab6a0cd\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 15:20:27.297662 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:20:27.297641 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1894e0ef-e31e-46f5-9135-e94d5ab6a0cd-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"1894e0ef-e31e-46f5-9135-e94d5ab6a0cd\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 15:20:27.302805 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:20:27.302773 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-f76hh\" (UniqueName: \"kubernetes.io/projected/1894e0ef-e31e-46f5-9135-e94d5ab6a0cd-kube-api-access-f76hh\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"1894e0ef-e31e-46f5-9135-e94d5ab6a0cd\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 15:20:27.426178 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:20:27.426140 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 15:20:27.565356 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:20:27.565278 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"] Apr 16 15:20:27.568963 ip-10-0-139-47 kubenswrapper[2579]: W0416 15:20:27.568930 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1894e0ef_e31e_46f5_9135_e94d5ab6a0cd.slice/crio-617f183e9887ee18a9876441f40a863ba73ae39834ce428b0c6a051301721fdb WatchSource:0}: Error finding container 617f183e9887ee18a9876441f40a863ba73ae39834ce428b0c6a051301721fdb: Status 404 returned error can't find the container with id 617f183e9887ee18a9876441f40a863ba73ae39834ce428b0c6a051301721fdb Apr 16 15:20:27.805077 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:20:27.805005 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"1894e0ef-e31e-46f5-9135-e94d5ab6a0cd","Type":"ContainerStarted","Data":"8e88074b38206d89e428198aa91c75e29df284c957afb43b969df06f7c7687ee"} Apr 16 15:20:27.805077 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:20:27.805057 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" 
event={"ID":"1894e0ef-e31e-46f5-9135-e94d5ab6a0cd","Type":"ContainerStarted","Data":"617f183e9887ee18a9876441f40a863ba73ae39834ce428b0c6a051301721fdb"} Apr 16 15:20:32.824480 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:20:32.824444 2579 generic.go:358] "Generic (PLEG): container finished" podID="1894e0ef-e31e-46f5-9135-e94d5ab6a0cd" containerID="8e88074b38206d89e428198aa91c75e29df284c957afb43b969df06f7c7687ee" exitCode=0 Apr 16 15:20:32.824877 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:20:32.824520 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"1894e0ef-e31e-46f5-9135-e94d5ab6a0cd","Type":"ContainerDied","Data":"8e88074b38206d89e428198aa91c75e29df284c957afb43b969df06f7c7687ee"} Apr 16 15:20:33.829397 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:20:33.829374 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-76b5d59b9fc5g2w_e7641919-79c9-46bf-a715-0969c96cecd9/main/0.log" Apr 16 15:20:33.829810 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:20:33.829782 2579 generic.go:358] "Generic (PLEG): container finished" podID="e7641919-79c9-46bf-a715-0969c96cecd9" containerID="f379c8f298735fcf31f5522cc3bd6c6f17b3817297096724e4582ef81ce00af5" exitCode=137 Apr 16 15:20:33.829949 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:20:33.829928 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-76b5d59b9fc5g2w" event={"ID":"e7641919-79c9-46bf-a715-0969c96cecd9","Type":"ContainerDied","Data":"f379c8f298735fcf31f5522cc3bd6c6f17b3817297096724e4582ef81ce00af5"} Apr 16 15:20:33.832358 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:20:33.831753 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" 
event={"ID":"1894e0ef-e31e-46f5-9135-e94d5ab6a0cd","Type":"ContainerStarted","Data":"58dd651830961ac77ea00e055a5d54c5503d31af7249514f712ca9f9c7ed7a9a"} Apr 16 15:20:33.854698 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:20:33.854623 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podStartSLOduration=6.854603741 podStartE2EDuration="6.854603741s" podCreationTimestamp="2026-04-16 15:20:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:20:33.851715932 +0000 UTC m=+1683.034253743" watchObservedRunningTime="2026-04-16 15:20:33.854603741 +0000 UTC m=+1683.037141516" Apr 16 15:20:33.980318 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:20:33.980288 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-76b5d59b9fc5g2w_e7641919-79c9-46bf-a715-0969c96cecd9/main/0.log" Apr 16 15:20:33.980752 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:20:33.980731 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-76b5d59b9fc5g2w" Apr 16 15:20:34.055131 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:20:34.055095 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e7641919-79c9-46bf-a715-0969c96cecd9-home\") pod \"e7641919-79c9-46bf-a715-0969c96cecd9\" (UID: \"e7641919-79c9-46bf-a715-0969c96cecd9\") " Apr 16 15:20:34.055131 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:20:34.055141 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e7641919-79c9-46bf-a715-0969c96cecd9-dshm\") pod \"e7641919-79c9-46bf-a715-0969c96cecd9\" (UID: \"e7641919-79c9-46bf-a715-0969c96cecd9\") " Apr 16 15:20:34.055382 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:20:34.055160 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xghpr\" (UniqueName: \"kubernetes.io/projected/e7641919-79c9-46bf-a715-0969c96cecd9-kube-api-access-xghpr\") pod \"e7641919-79c9-46bf-a715-0969c96cecd9\" (UID: \"e7641919-79c9-46bf-a715-0969c96cecd9\") " Apr 16 15:20:34.055382 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:20:34.055195 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e7641919-79c9-46bf-a715-0969c96cecd9-tls-certs\") pod \"e7641919-79c9-46bf-a715-0969c96cecd9\" (UID: \"e7641919-79c9-46bf-a715-0969c96cecd9\") " Apr 16 15:20:34.055382 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:20:34.055259 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e7641919-79c9-46bf-a715-0969c96cecd9-model-cache\") pod \"e7641919-79c9-46bf-a715-0969c96cecd9\" (UID: \"e7641919-79c9-46bf-a715-0969c96cecd9\") " Apr 16 15:20:34.055382 ip-10-0-139-47 kubenswrapper[2579]: I0416 
15:20:34.055332 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e7641919-79c9-46bf-a715-0969c96cecd9-kserve-provision-location\") pod \"e7641919-79c9-46bf-a715-0969c96cecd9\" (UID: \"e7641919-79c9-46bf-a715-0969c96cecd9\") " Apr 16 15:20:34.055611 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:20:34.055509 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e7641919-79c9-46bf-a715-0969c96cecd9-home" (OuterVolumeSpecName: "home") pod "e7641919-79c9-46bf-a715-0969c96cecd9" (UID: "e7641919-79c9-46bf-a715-0969c96cecd9"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:20:34.055764 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:20:34.055739 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e7641919-79c9-46bf-a715-0969c96cecd9-model-cache" (OuterVolumeSpecName: "model-cache") pod "e7641919-79c9-46bf-a715-0969c96cecd9" (UID: "e7641919-79c9-46bf-a715-0969c96cecd9"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:20:34.057855 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:20:34.057824 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e7641919-79c9-46bf-a715-0969c96cecd9-dshm" (OuterVolumeSpecName: "dshm") pod "e7641919-79c9-46bf-a715-0969c96cecd9" (UID: "e7641919-79c9-46bf-a715-0969c96cecd9"). InnerVolumeSpecName "dshm". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:20:34.058034 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:20:34.057976 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7641919-79c9-46bf-a715-0969c96cecd9-kube-api-access-xghpr" (OuterVolumeSpecName: "kube-api-access-xghpr") pod "e7641919-79c9-46bf-a715-0969c96cecd9" (UID: "e7641919-79c9-46bf-a715-0969c96cecd9"). InnerVolumeSpecName "kube-api-access-xghpr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 15:20:34.058345 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:20:34.058315 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7641919-79c9-46bf-a715-0969c96cecd9-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "e7641919-79c9-46bf-a715-0969c96cecd9" (UID: "e7641919-79c9-46bf-a715-0969c96cecd9"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 15:20:34.099755 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:20:34.099706 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e7641919-79c9-46bf-a715-0969c96cecd9-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "e7641919-79c9-46bf-a715-0969c96cecd9" (UID: "e7641919-79c9-46bf-a715-0969c96cecd9"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:20:34.156094 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:20:34.156047 2579 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e7641919-79c9-46bf-a715-0969c96cecd9-tls-certs\") on node \"ip-10-0-139-47.ec2.internal\" DevicePath \"\"" Apr 16 15:20:34.156094 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:20:34.156092 2579 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e7641919-79c9-46bf-a715-0969c96cecd9-model-cache\") on node \"ip-10-0-139-47.ec2.internal\" DevicePath \"\"" Apr 16 15:20:34.156342 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:20:34.156110 2579 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e7641919-79c9-46bf-a715-0969c96cecd9-kserve-provision-location\") on node \"ip-10-0-139-47.ec2.internal\" DevicePath \"\"" Apr 16 15:20:34.156342 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:20:34.156124 2579 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e7641919-79c9-46bf-a715-0969c96cecd9-home\") on node \"ip-10-0-139-47.ec2.internal\" DevicePath \"\"" Apr 16 15:20:34.156342 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:20:34.156138 2579 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e7641919-79c9-46bf-a715-0969c96cecd9-dshm\") on node \"ip-10-0-139-47.ec2.internal\" DevicePath \"\"" Apr 16 15:20:34.156342 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:20:34.156151 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xghpr\" (UniqueName: \"kubernetes.io/projected/e7641919-79c9-46bf-a715-0969c96cecd9-kube-api-access-xghpr\") on node \"ip-10-0-139-47.ec2.internal\" DevicePath \"\"" Apr 16 15:20:34.837327 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:20:34.837297 2579 log.go:25] "Finished 
parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-76b5d59b9fc5g2w_e7641919-79c9-46bf-a715-0969c96cecd9/main/0.log" Apr 16 15:20:34.837925 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:20:34.837767 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-76b5d59b9fc5g2w" Apr 16 15:20:34.837925 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:20:34.837774 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-76b5d59b9fc5g2w" event={"ID":"e7641919-79c9-46bf-a715-0969c96cecd9","Type":"ContainerDied","Data":"b94b3515752c2c08001d1d91746909385829faa2e44b490fe3e6771e554d0046"} Apr 16 15:20:34.837925 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:20:34.837829 2579 scope.go:117] "RemoveContainer" containerID="f379c8f298735fcf31f5522cc3bd6c6f17b3817297096724e4582ef81ce00af5" Apr 16 15:20:34.860350 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:20:34.860319 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-76b5d59b9fc5g2w"] Apr 16 15:20:34.864928 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:20:34.864878 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-76b5d59b9fc5g2w"] Apr 16 15:20:34.867447 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:20:34.867424 2579 scope.go:117] "RemoveContainer" containerID="22333c18e845e23abe959399f9ca8605e6126d8ac6162d0525e9bcd2a37e3b6e" Apr 16 15:20:35.355839 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:20:35.355795 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7641919-79c9-46bf-a715-0969c96cecd9" path="/var/lib/kubelet/pods/e7641919-79c9-46bf-a715-0969c96cecd9/volumes" Apr 16 15:20:37.001045 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:20:37.000609 2579 
prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-ff85f4ccc-clxp4" podUID="97e99309-af9c-4aa1-a1ff-c22f2488ceab" containerName="main" probeResult="failure" output="Get \"https://10.132.0.36:8001/health\": dial tcp 10.132.0.36:8001: connect: connection refused" Apr 16 15:20:37.037072 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:20:37.037028 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7b8478db74-lffss" podUID="4d817844-eeaf-4e78-a00d-66aae2ac79af" containerName="main" probeResult="failure" output="Get \"https://10.132.0.37:8000/health\": dial tcp 10.132.0.37:8000: connect: connection refused" Apr 16 15:20:37.426922 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:20:37.426795 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 15:20:37.428422 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:20:37.428391 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="1894e0ef-e31e-46f5-9135-e94d5ab6a0cd" containerName="main" probeResult="failure" output="Get \"https://10.132.0.38:8000/health\": dial tcp 10.132.0.38:8000: connect: connection refused" Apr 16 15:20:46.999544 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:20:46.999494 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-ff85f4ccc-clxp4" podUID="97e99309-af9c-4aa1-a1ff-c22f2488ceab" containerName="main" probeResult="failure" output="Get \"https://10.132.0.36:8001/health\": dial tcp 10.132.0.36:8001: connect: connection refused" Apr 16 15:20:47.037268 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:20:47.037226 2579 prober.go:120] "Probe failed" probeType="Startup" 
pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7b8478db74-lffss" podUID="4d817844-eeaf-4e78-a00d-66aae2ac79af" containerName="main" probeResult="failure" output="Get \"https://10.132.0.37:8000/health\": dial tcp 10.132.0.37:8000: connect: connection refused" Apr 16 15:20:47.427177 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:20:47.427074 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="1894e0ef-e31e-46f5-9135-e94d5ab6a0cd" containerName="main" probeResult="failure" output="Get \"https://10.132.0.38:8000/health\": dial tcp 10.132.0.38:8000: connect: connection refused" Apr 16 15:20:56.999846 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:20:56.999737 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-ff85f4ccc-clxp4" podUID="97e99309-af9c-4aa1-a1ff-c22f2488ceab" containerName="main" probeResult="failure" output="Get \"https://10.132.0.36:8001/health\": dial tcp 10.132.0.36:8001: connect: connection refused" Apr 16 15:20:57.037483 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:20:57.037434 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7b8478db74-lffss" podUID="4d817844-eeaf-4e78-a00d-66aae2ac79af" containerName="main" probeResult="failure" output="Get \"https://10.132.0.37:8000/health\": dial tcp 10.132.0.37:8000: connect: connection refused" Apr 16 15:20:57.427335 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:20:57.427248 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 15:20:57.427660 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:20:57.427628 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" 
podUID="1894e0ef-e31e-46f5-9135-e94d5ab6a0cd" containerName="main" probeResult="failure" output="Get \"https://10.132.0.38:8000/health\": dial tcp 10.132.0.38:8000: connect: connection refused" Apr 16 15:21:07.000310 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:21:07.000257 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-ff85f4ccc-clxp4" podUID="97e99309-af9c-4aa1-a1ff-c22f2488ceab" containerName="main" probeResult="failure" output="Get \"https://10.132.0.36:8001/health\": dial tcp 10.132.0.36:8001: connect: connection refused" Apr 16 15:21:07.037217 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:21:07.037174 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7b8478db74-lffss" podUID="4d817844-eeaf-4e78-a00d-66aae2ac79af" containerName="main" probeResult="failure" output="Get \"https://10.132.0.37:8000/health\": dial tcp 10.132.0.37:8000: connect: connection refused" Apr 16 15:21:07.426884 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:21:07.426779 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="1894e0ef-e31e-46f5-9135-e94d5ab6a0cd" containerName="main" probeResult="failure" output="Get \"https://10.132.0.38:8000/health\": dial tcp 10.132.0.38:8000: connect: connection refused" Apr 16 15:21:17.000249 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:21:17.000198 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-ff85f4ccc-clxp4" podUID="97e99309-af9c-4aa1-a1ff-c22f2488ceab" containerName="main" probeResult="failure" output="Get \"https://10.132.0.36:8001/health\": dial tcp 10.132.0.36:8001: connect: connection refused" Apr 16 15:21:17.036952 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:21:17.036890 2579 prober.go:120] "Probe failed" probeType="Startup" 
pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7b8478db74-lffss" podUID="4d817844-eeaf-4e78-a00d-66aae2ac79af" containerName="main" probeResult="failure" output="Get \"https://10.132.0.37:8000/health\": dial tcp 10.132.0.37:8000: connect: connection refused"
Apr 16 15:21:17.426989 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:21:17.426876 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="1894e0ef-e31e-46f5-9135-e94d5ab6a0cd" containerName="main" probeResult="failure" output="Get \"https://10.132.0.38:8000/health\": dial tcp 10.132.0.38:8000: connect: connection refused"
Apr 16 15:21:26.999856 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:21:26.999800 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-ff85f4ccc-clxp4" podUID="97e99309-af9c-4aa1-a1ff-c22f2488ceab" containerName="main" probeResult="failure" output="Get \"https://10.132.0.36:8001/health\": dial tcp 10.132.0.36:8001: connect: connection refused"
Apr 16 15:21:27.037258 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:21:27.037218 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7b8478db74-lffss" podUID="4d817844-eeaf-4e78-a00d-66aae2ac79af" containerName="main" probeResult="failure" output="Get \"https://10.132.0.37:8000/health\": dial tcp 10.132.0.37:8000: connect: connection refused"
Apr 16 15:21:27.427271 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:21:27.427169 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="1894e0ef-e31e-46f5-9135-e94d5ab6a0cd" containerName="main" probeResult="failure" output="Get \"https://10.132.0.38:8000/health\": dial tcp 10.132.0.38:8000: connect: connection refused"
Apr 16 15:21:36.999962 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:21:36.999916 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-ff85f4ccc-clxp4" podUID="97e99309-af9c-4aa1-a1ff-c22f2488ceab" containerName="main" probeResult="failure" output="Get \"https://10.132.0.36:8001/health\": dial tcp 10.132.0.36:8001: connect: connection refused"
Apr 16 15:21:37.037159 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:21:37.037105 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7b8478db74-lffss" podUID="4d817844-eeaf-4e78-a00d-66aae2ac79af" containerName="main" probeResult="failure" output="Get \"https://10.132.0.37:8000/health\": dial tcp 10.132.0.37:8000: connect: connection refused"
Apr 16 15:21:37.427283 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:21:37.427193 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="1894e0ef-e31e-46f5-9135-e94d5ab6a0cd" containerName="main" probeResult="failure" output="Get \"https://10.132.0.38:8000/health\": dial tcp 10.132.0.38:8000: connect: connection refused"
Apr 16 15:21:46.999862 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:21:46.999817 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-ff85f4ccc-clxp4" podUID="97e99309-af9c-4aa1-a1ff-c22f2488ceab" containerName="main" probeResult="failure" output="Get \"https://10.132.0.36:8001/health\": dial tcp 10.132.0.36:8001: connect: connection refused"
Apr 16 15:21:47.037418 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:21:47.037378 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7b8478db74-lffss" podUID="4d817844-eeaf-4e78-a00d-66aae2ac79af" containerName="main" probeResult="failure" output="Get \"https://10.132.0.37:8000/health\": dial tcp 10.132.0.37:8000: connect: connection refused"
Apr 16 15:21:47.426925 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:21:47.426810 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="1894e0ef-e31e-46f5-9135-e94d5ab6a0cd" containerName="main" probeResult="failure" output="Get \"https://10.132.0.38:8000/health\": dial tcp 10.132.0.38:8000: connect: connection refused"
Apr 16 15:21:57.009728 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:21:57.009693 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-ff85f4ccc-clxp4"
Apr 16 15:21:57.023445 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:21:57.023412 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-ff85f4ccc-clxp4"
Apr 16 15:21:57.036817 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:21:57.036775 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7b8478db74-lffss" podUID="4d817844-eeaf-4e78-a00d-66aae2ac79af" containerName="main" probeResult="failure" output="Get \"https://10.132.0.37:8000/health\": dial tcp 10.132.0.37:8000: connect: connection refused"
Apr 16 15:21:57.427266 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:21:57.427179 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="1894e0ef-e31e-46f5-9135-e94d5ab6a0cd" containerName="main" probeResult="failure" output="Get \"https://10.132.0.38:8000/health\": dial tcp 10.132.0.38:8000: connect: connection refused"
Apr 16 15:22:07.037165 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:22:07.037116 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7b8478db74-lffss" podUID="4d817844-eeaf-4e78-a00d-66aae2ac79af" containerName="main" probeResult="failure" output="Get \"https://10.132.0.37:8000/health\": dial tcp 10.132.0.37:8000: connect: connection refused"
Apr 16 15:22:07.427599 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:22:07.427491 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="1894e0ef-e31e-46f5-9135-e94d5ab6a0cd" containerName="main" probeResult="failure" output="Get \"https://10.132.0.38:8000/health\": dial tcp 10.132.0.38:8000: connect: connection refused"
Apr 16 15:22:17.036847 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:22:17.036804 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7b8478db74-lffss" podUID="4d817844-eeaf-4e78-a00d-66aae2ac79af" containerName="main" probeResult="failure" output="Get \"https://10.132.0.37:8000/health\": dial tcp 10.132.0.37:8000: connect: connection refused"
Apr 16 15:22:17.426725 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:22:17.426619 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="1894e0ef-e31e-46f5-9135-e94d5ab6a0cd" containerName="main" probeResult="failure" output="Get \"https://10.132.0.38:8000/health\": dial tcp 10.132.0.38:8000: connect: connection refused"
Apr 16 15:22:27.037081 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:22:27.037035 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7b8478db74-lffss" podUID="4d817844-eeaf-4e78-a00d-66aae2ac79af" containerName="main" probeResult="failure" output="Get \"https://10.132.0.37:8000/health\": dial tcp 10.132.0.37:8000: connect: connection refused"
Apr 16 15:22:27.426956 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:22:27.426846 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="1894e0ef-e31e-46f5-9135-e94d5ab6a0cd" containerName="main" probeResult="failure" output="Get \"https://10.132.0.38:8000/health\": dial tcp 10.132.0.38:8000: connect: connection refused"
Apr 16 15:22:37.037518 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:22:37.037479 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7b8478db74-lffss" podUID="4d817844-eeaf-4e78-a00d-66aae2ac79af" containerName="main" probeResult="failure" output="Get \"https://10.132.0.37:8000/health\": dial tcp 10.132.0.37:8000: connect: connection refused"
Apr 16 15:22:37.427406 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:22:37.427303 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="1894e0ef-e31e-46f5-9135-e94d5ab6a0cd" containerName="main" probeResult="failure" output="Get \"https://10.132.0.38:8000/health\": dial tcp 10.132.0.38:8000: connect: connection refused"
Apr 16 15:22:47.046888 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:22:47.046851 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7b8478db74-lffss"
Apr 16 15:22:47.055810 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:22:47.055776 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7b8478db74-lffss"
Apr 16 15:22:47.436433 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:22:47.436350 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 16 15:22:47.444071 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:22:47.444048 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 16 15:22:58.983235 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:22:58.983199 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"]
Apr 16 15:22:58.983817 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:22:58.983493 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="1894e0ef-e31e-46f5-9135-e94d5ab6a0cd" containerName="main" containerID="cri-o://58dd651830961ac77ea00e055a5d54c5503d31af7249514f712ca9f9c7ed7a9a" gracePeriod=30
Apr 16 15:23:00.136052 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:00.136021 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 16 15:23:00.226925 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:00.226810 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1894e0ef-e31e-46f5-9135-e94d5ab6a0cd-tls-certs\") pod \"1894e0ef-e31e-46f5-9135-e94d5ab6a0cd\" (UID: \"1894e0ef-e31e-46f5-9135-e94d5ab6a0cd\") "
Apr 16 15:23:00.227104 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:00.226926 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1894e0ef-e31e-46f5-9135-e94d5ab6a0cd-home\") pod \"1894e0ef-e31e-46f5-9135-e94d5ab6a0cd\" (UID: \"1894e0ef-e31e-46f5-9135-e94d5ab6a0cd\") "
Apr 16 15:23:00.227104 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:00.226966 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1894e0ef-e31e-46f5-9135-e94d5ab6a0cd-kserve-provision-location\") pod \"1894e0ef-e31e-46f5-9135-e94d5ab6a0cd\" (UID: \"1894e0ef-e31e-46f5-9135-e94d5ab6a0cd\") "
Apr 16 15:23:00.227104 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:00.227021 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1894e0ef-e31e-46f5-9135-e94d5ab6a0cd-dshm\") pod \"1894e0ef-e31e-46f5-9135-e94d5ab6a0cd\" (UID: \"1894e0ef-e31e-46f5-9135-e94d5ab6a0cd\") "
Apr 16 15:23:00.227104 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:00.227047 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f76hh\" (UniqueName: \"kubernetes.io/projected/1894e0ef-e31e-46f5-9135-e94d5ab6a0cd-kube-api-access-f76hh\") pod \"1894e0ef-e31e-46f5-9135-e94d5ab6a0cd\" (UID: \"1894e0ef-e31e-46f5-9135-e94d5ab6a0cd\") "
Apr 16 15:23:00.227104 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:00.227070 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1894e0ef-e31e-46f5-9135-e94d5ab6a0cd-model-cache\") pod \"1894e0ef-e31e-46f5-9135-e94d5ab6a0cd\" (UID: \"1894e0ef-e31e-46f5-9135-e94d5ab6a0cd\") "
Apr 16 15:23:00.227647 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:00.227609 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1894e0ef-e31e-46f5-9135-e94d5ab6a0cd-home" (OuterVolumeSpecName: "home") pod "1894e0ef-e31e-46f5-9135-e94d5ab6a0cd" (UID: "1894e0ef-e31e-46f5-9135-e94d5ab6a0cd"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 15:23:00.227758 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:00.227629 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1894e0ef-e31e-46f5-9135-e94d5ab6a0cd-model-cache" (OuterVolumeSpecName: "model-cache") pod "1894e0ef-e31e-46f5-9135-e94d5ab6a0cd" (UID: "1894e0ef-e31e-46f5-9135-e94d5ab6a0cd"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 15:23:00.229260 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:00.229225 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1894e0ef-e31e-46f5-9135-e94d5ab6a0cd-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "1894e0ef-e31e-46f5-9135-e94d5ab6a0cd" (UID: "1894e0ef-e31e-46f5-9135-e94d5ab6a0cd"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 15:23:00.229365 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:00.229276 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1894e0ef-e31e-46f5-9135-e94d5ab6a0cd-dshm" (OuterVolumeSpecName: "dshm") pod "1894e0ef-e31e-46f5-9135-e94d5ab6a0cd" (UID: "1894e0ef-e31e-46f5-9135-e94d5ab6a0cd"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 15:23:00.229584 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:00.229556 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1894e0ef-e31e-46f5-9135-e94d5ab6a0cd-kube-api-access-f76hh" (OuterVolumeSpecName: "kube-api-access-f76hh") pod "1894e0ef-e31e-46f5-9135-e94d5ab6a0cd" (UID: "1894e0ef-e31e-46f5-9135-e94d5ab6a0cd"). InnerVolumeSpecName "kube-api-access-f76hh". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 15:23:00.286845 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:00.286791 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1894e0ef-e31e-46f5-9135-e94d5ab6a0cd-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "1894e0ef-e31e-46f5-9135-e94d5ab6a0cd" (UID: "1894e0ef-e31e-46f5-9135-e94d5ab6a0cd"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 15:23:00.328207 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:00.328165 2579 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1894e0ef-e31e-46f5-9135-e94d5ab6a0cd-home\") on node \"ip-10-0-139-47.ec2.internal\" DevicePath \"\""
Apr 16 15:23:00.328207 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:00.328201 2579 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1894e0ef-e31e-46f5-9135-e94d5ab6a0cd-kserve-provision-location\") on node \"ip-10-0-139-47.ec2.internal\" DevicePath \"\""
Apr 16 15:23:00.328207 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:00.328212 2579 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1894e0ef-e31e-46f5-9135-e94d5ab6a0cd-dshm\") on node \"ip-10-0-139-47.ec2.internal\" DevicePath \"\""
Apr 16 15:23:00.328451 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:00.328224 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-f76hh\" (UniqueName: \"kubernetes.io/projected/1894e0ef-e31e-46f5-9135-e94d5ab6a0cd-kube-api-access-f76hh\") on node \"ip-10-0-139-47.ec2.internal\" DevicePath \"\""
Apr 16 15:23:00.328451 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:00.328234 2579 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1894e0ef-e31e-46f5-9135-e94d5ab6a0cd-model-cache\") on node \"ip-10-0-139-47.ec2.internal\" DevicePath \"\""
Apr 16 15:23:00.328451 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:00.328243 2579 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1894e0ef-e31e-46f5-9135-e94d5ab6a0cd-tls-certs\") on node \"ip-10-0-139-47.ec2.internal\" DevicePath \"\""
Apr 16 15:23:00.338853 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:00.338817 2579 generic.go:358] "Generic (PLEG): container finished" podID="1894e0ef-e31e-46f5-9135-e94d5ab6a0cd" containerID="58dd651830961ac77ea00e055a5d54c5503d31af7249514f712ca9f9c7ed7a9a" exitCode=0
Apr 16 15:23:00.339047 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:00.338923 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"1894e0ef-e31e-46f5-9135-e94d5ab6a0cd","Type":"ContainerDied","Data":"58dd651830961ac77ea00e055a5d54c5503d31af7249514f712ca9f9c7ed7a9a"}
Apr 16 15:23:00.339047 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:00.338972 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"1894e0ef-e31e-46f5-9135-e94d5ab6a0cd","Type":"ContainerDied","Data":"617f183e9887ee18a9876441f40a863ba73ae39834ce428b0c6a051301721fdb"}
Apr 16 15:23:00.339047 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:00.338989 2579 scope.go:117] "RemoveContainer" containerID="58dd651830961ac77ea00e055a5d54c5503d31af7249514f712ca9f9c7ed7a9a"
Apr 16 15:23:00.339047 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:00.338934 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 16 15:23:00.361017 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:00.360981 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"]
Apr 16 15:23:00.364111 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:00.364079 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"]
Apr 16 15:23:00.368460 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:00.368440 2579 scope.go:117] "RemoveContainer" containerID="8e88074b38206d89e428198aa91c75e29df284c957afb43b969df06f7c7687ee"
Apr 16 15:23:00.434239 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:00.434211 2579 scope.go:117] "RemoveContainer" containerID="58dd651830961ac77ea00e055a5d54c5503d31af7249514f712ca9f9c7ed7a9a"
Apr 16 15:23:00.434585 ip-10-0-139-47 kubenswrapper[2579]: E0416 15:23:00.434564 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58dd651830961ac77ea00e055a5d54c5503d31af7249514f712ca9f9c7ed7a9a\": container with ID starting with 58dd651830961ac77ea00e055a5d54c5503d31af7249514f712ca9f9c7ed7a9a not found: ID does not exist" containerID="58dd651830961ac77ea00e055a5d54c5503d31af7249514f712ca9f9c7ed7a9a"
Apr 16 15:23:00.434646 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:00.434597 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58dd651830961ac77ea00e055a5d54c5503d31af7249514f712ca9f9c7ed7a9a"} err="failed to get container status \"58dd651830961ac77ea00e055a5d54c5503d31af7249514f712ca9f9c7ed7a9a\": rpc error: code = NotFound desc = could not find container \"58dd651830961ac77ea00e055a5d54c5503d31af7249514f712ca9f9c7ed7a9a\": container with ID starting with 58dd651830961ac77ea00e055a5d54c5503d31af7249514f712ca9f9c7ed7a9a not found: ID does not exist"
Apr 16 15:23:00.434646 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:00.434622 2579 scope.go:117] "RemoveContainer" containerID="8e88074b38206d89e428198aa91c75e29df284c957afb43b969df06f7c7687ee"
Apr 16 15:23:00.434956 ip-10-0-139-47 kubenswrapper[2579]: E0416 15:23:00.434927 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e88074b38206d89e428198aa91c75e29df284c957afb43b969df06f7c7687ee\": container with ID starting with 8e88074b38206d89e428198aa91c75e29df284c957afb43b969df06f7c7687ee not found: ID does not exist" containerID="8e88074b38206d89e428198aa91c75e29df284c957afb43b969df06f7c7687ee"
Apr 16 15:23:00.435073 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:00.434957 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e88074b38206d89e428198aa91c75e29df284c957afb43b969df06f7c7687ee"} err="failed to get container status \"8e88074b38206d89e428198aa91c75e29df284c957afb43b969df06f7c7687ee\": rpc error: code = NotFound desc = could not find container \"8e88074b38206d89e428198aa91c75e29df284c957afb43b969df06f7c7687ee\": container with ID starting with 8e88074b38206d89e428198aa91c75e29df284c957afb43b969df06f7c7687ee not found: ID does not exist"
Apr 16 15:23:01.357323 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:01.357253 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1894e0ef-e31e-46f5-9135-e94d5ab6a0cd" path="/var/lib/kubelet/pods/1894e0ef-e31e-46f5-9135-e94d5ab6a0cd/volumes"
Apr 16 15:23:17.559791 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:17.559758 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7b8478db74-lffss"]
Apr 16 15:23:17.560212 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:17.560090 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7b8478db74-lffss" podUID="4d817844-eeaf-4e78-a00d-66aae2ac79af" containerName="main" containerID="cri-o://cab518df837157361024db348ed82dd7c7754176f4f2e47f650d8e5ca75df150" gracePeriod=30
Apr 16 15:23:17.563368 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:17.563343 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-ff85f4ccc-clxp4"]
Apr 16 15:23:17.564263 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:17.563739 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-ff85f4ccc-clxp4" podUID="97e99309-af9c-4aa1-a1ff-c22f2488ceab" containerName="main" containerID="cri-o://bb7b8dc8192f550dc6cc8bff7b14ce1cd7fc6d96482b17c1bb54f3ffae1672ba" gracePeriod=30
Apr 16 15:23:38.478081 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:38.478044 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-74cc998fdb-ch4xs"]
Apr 16 15:23:38.478472 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:38.478326 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e7641919-79c9-46bf-a715-0969c96cecd9" containerName="main"
Apr 16 15:23:38.478472 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:38.478337 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7641919-79c9-46bf-a715-0969c96cecd9" containerName="main"
Apr 16 15:23:38.478472 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:38.478359 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1894e0ef-e31e-46f5-9135-e94d5ab6a0cd" containerName="storage-initializer"
Apr 16 15:23:38.478472 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:38.478369 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="1894e0ef-e31e-46f5-9135-e94d5ab6a0cd" containerName="storage-initializer"
Apr 16 15:23:38.478472 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:38.478380 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1894e0ef-e31e-46f5-9135-e94d5ab6a0cd" containerName="main"
Apr 16 15:23:38.478472 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:38.478387 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="1894e0ef-e31e-46f5-9135-e94d5ab6a0cd" containerName="main"
Apr 16 15:23:38.478472 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:38.478397 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e7641919-79c9-46bf-a715-0969c96cecd9" containerName="storage-initializer"
Apr 16 15:23:38.478472 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:38.478402 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7641919-79c9-46bf-a715-0969c96cecd9" containerName="storage-initializer"
Apr 16 15:23:38.478472 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:38.478451 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="e7641919-79c9-46bf-a715-0969c96cecd9" containerName="main"
Apr 16 15:23:38.478472 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:38.478462 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="1894e0ef-e31e-46f5-9135-e94d5ab6a0cd" containerName="main"
Apr 16 15:23:38.481993 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:38.481970 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-74cc998fdb-ch4xs"
Apr 16 15:23:38.484642 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:38.484620 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-kserve-self-signed-certs\""
Apr 16 15:23:38.492254 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:38.492227 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-74cc998fdb-ch4xs"]
Apr 16 15:23:38.566662 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:38.566627 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f1a47ca8-0768-450c-ac73-ecf7f5bf152c-dshm\") pod \"router-with-refs-pd-test-kserve-prefill-74cc998fdb-ch4xs\" (UID: \"f1a47ca8-0768-450c-ac73-ecf7f5bf152c\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-74cc998fdb-ch4xs"
Apr 16 15:23:38.566837 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:38.566678 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsmk4\" (UniqueName: \"kubernetes.io/projected/f1a47ca8-0768-450c-ac73-ecf7f5bf152c-kube-api-access-jsmk4\") pod \"router-with-refs-pd-test-kserve-prefill-74cc998fdb-ch4xs\" (UID: \"f1a47ca8-0768-450c-ac73-ecf7f5bf152c\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-74cc998fdb-ch4xs"
Apr 16 15:23:38.566837 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:38.566707 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f1a47ca8-0768-450c-ac73-ecf7f5bf152c-model-cache\") pod \"router-with-refs-pd-test-kserve-prefill-74cc998fdb-ch4xs\" (UID: \"f1a47ca8-0768-450c-ac73-ecf7f5bf152c\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-74cc998fdb-ch4xs"
Apr 16 15:23:38.566837 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:38.566741 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f1a47ca8-0768-450c-ac73-ecf7f5bf152c-home\") pod \"router-with-refs-pd-test-kserve-prefill-74cc998fdb-ch4xs\" (UID: \"f1a47ca8-0768-450c-ac73-ecf7f5bf152c\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-74cc998fdb-ch4xs"
Apr 16 15:23:38.566837 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:38.566765 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f1a47ca8-0768-450c-ac73-ecf7f5bf152c-tls-certs\") pod \"router-with-refs-pd-test-kserve-prefill-74cc998fdb-ch4xs\" (UID: \"f1a47ca8-0768-450c-ac73-ecf7f5bf152c\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-74cc998fdb-ch4xs"
Apr 16 15:23:38.566837 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:38.566816 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f1a47ca8-0768-450c-ac73-ecf7f5bf152c-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-prefill-74cc998fdb-ch4xs\" (UID: \"f1a47ca8-0768-450c-ac73-ecf7f5bf152c\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-74cc998fdb-ch4xs"
Apr 16 15:23:38.667323 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:38.667286 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jsmk4\" (UniqueName: \"kubernetes.io/projected/f1a47ca8-0768-450c-ac73-ecf7f5bf152c-kube-api-access-jsmk4\") pod \"router-with-refs-pd-test-kserve-prefill-74cc998fdb-ch4xs\" (UID: \"f1a47ca8-0768-450c-ac73-ecf7f5bf152c\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-74cc998fdb-ch4xs"
Apr 16 15:23:38.667507 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:38.667331 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f1a47ca8-0768-450c-ac73-ecf7f5bf152c-model-cache\") pod \"router-with-refs-pd-test-kserve-prefill-74cc998fdb-ch4xs\" (UID: \"f1a47ca8-0768-450c-ac73-ecf7f5bf152c\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-74cc998fdb-ch4xs"
Apr 16 15:23:38.667507 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:38.667356 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f1a47ca8-0768-450c-ac73-ecf7f5bf152c-home\") pod \"router-with-refs-pd-test-kserve-prefill-74cc998fdb-ch4xs\" (UID: \"f1a47ca8-0768-450c-ac73-ecf7f5bf152c\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-74cc998fdb-ch4xs"
Apr 16 15:23:38.667611 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:38.667522 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f1a47ca8-0768-450c-ac73-ecf7f5bf152c-tls-certs\") pod \"router-with-refs-pd-test-kserve-prefill-74cc998fdb-ch4xs\" (UID: \"f1a47ca8-0768-450c-ac73-ecf7f5bf152c\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-74cc998fdb-ch4xs"
Apr 16 15:23:38.667657 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:38.667632 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f1a47ca8-0768-450c-ac73-ecf7f5bf152c-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-prefill-74cc998fdb-ch4xs\" (UID: \"f1a47ca8-0768-450c-ac73-ecf7f5bf152c\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-74cc998fdb-ch4xs"
Apr 16 15:23:38.667700 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:38.667687 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f1a47ca8-0768-450c-ac73-ecf7f5bf152c-dshm\") pod \"router-with-refs-pd-test-kserve-prefill-74cc998fdb-ch4xs\" (UID: \"f1a47ca8-0768-450c-ac73-ecf7f5bf152c\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-74cc998fdb-ch4xs"
Apr 16 15:23:38.667740 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:38.667724 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f1a47ca8-0768-450c-ac73-ecf7f5bf152c-home\") pod \"router-with-refs-pd-test-kserve-prefill-74cc998fdb-ch4xs\" (UID: \"f1a47ca8-0768-450c-ac73-ecf7f5bf152c\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-74cc998fdb-ch4xs"
Apr 16 15:23:38.667788 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:38.667766 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f1a47ca8-0768-450c-ac73-ecf7f5bf152c-model-cache\") pod \"router-with-refs-pd-test-kserve-prefill-74cc998fdb-ch4xs\" (UID: \"f1a47ca8-0768-450c-ac73-ecf7f5bf152c\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-74cc998fdb-ch4xs"
Apr 16 15:23:38.668003 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:38.667976 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f1a47ca8-0768-450c-ac73-ecf7f5bf152c-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-prefill-74cc998fdb-ch4xs\" (UID: \"f1a47ca8-0768-450c-ac73-ecf7f5bf152c\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-74cc998fdb-ch4xs"
Apr 16 15:23:38.669836 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:38.669812 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f1a47ca8-0768-450c-ac73-ecf7f5bf152c-dshm\") pod \"router-with-refs-pd-test-kserve-prefill-74cc998fdb-ch4xs\" (UID: \"f1a47ca8-0768-450c-ac73-ecf7f5bf152c\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-74cc998fdb-ch4xs"
Apr 16 15:23:38.670040 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:38.670023 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f1a47ca8-0768-450c-ac73-ecf7f5bf152c-tls-certs\") pod \"router-with-refs-pd-test-kserve-prefill-74cc998fdb-ch4xs\" (UID: \"f1a47ca8-0768-450c-ac73-ecf7f5bf152c\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-74cc998fdb-ch4xs"
Apr 16 15:23:38.674387 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:38.674370 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsmk4\" (UniqueName: \"kubernetes.io/projected/f1a47ca8-0768-450c-ac73-ecf7f5bf152c-kube-api-access-jsmk4\") pod \"router-with-refs-pd-test-kserve-prefill-74cc998fdb-ch4xs\" (UID: \"f1a47ca8-0768-450c-ac73-ecf7f5bf152c\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-74cc998fdb-ch4xs"
Apr 16 15:23:38.792969 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:38.792936 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-74cc998fdb-ch4xs"
Apr 16 15:23:38.917003 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:38.916961 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-74cc998fdb-ch4xs"]
Apr 16 15:23:38.920012 ip-10-0-139-47 kubenswrapper[2579]: W0416 15:23:38.919982 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1a47ca8_0768_450c_ac73_ecf7f5bf152c.slice/crio-8a8d1b61af885b1b5cd5d3ac99ff612693df3e8d6e89f67d9e4bccf4443b14e6 WatchSource:0}: Error finding container 8a8d1b61af885b1b5cd5d3ac99ff612693df3e8d6e89f67d9e4bccf4443b14e6: Status 404 returned error can't find the container with id 8a8d1b61af885b1b5cd5d3ac99ff612693df3e8d6e89f67d9e4bccf4443b14e6
Apr 16 15:23:39.466235 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:39.466199 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-74cc998fdb-ch4xs" event={"ID":"f1a47ca8-0768-450c-ac73-ecf7f5bf152c","Type":"ContainerStarted","Data":"626130f608602bcd778c651c111374e05f192d0de5f980a2e44448c106f3f990"}
Apr 16 15:23:39.466235 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:39.466238 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-74cc998fdb-ch4xs" event={"ID":"f1a47ca8-0768-450c-ac73-ecf7f5bf152c","Type":"ContainerStarted","Data":"8a8d1b61af885b1b5cd5d3ac99ff612693df3e8d6e89f67d9e4bccf4443b14e6"}
Apr 16 15:23:43.481694 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:43.481600 2579 generic.go:358] "Generic (PLEG): container finished" podID="f1a47ca8-0768-450c-ac73-ecf7f5bf152c" containerID="626130f608602bcd778c651c111374e05f192d0de5f980a2e44448c106f3f990" exitCode=0
Apr 16 15:23:43.482203 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:43.481682 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-74cc998fdb-ch4xs" event={"ID":"f1a47ca8-0768-450c-ac73-ecf7f5bf152c","Type":"ContainerDied","Data":"626130f608602bcd778c651c111374e05f192d0de5f980a2e44448c106f3f990"}
Apr 16 15:23:44.486940 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:44.486883 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-74cc998fdb-ch4xs" event={"ID":"f1a47ca8-0768-450c-ac73-ecf7f5bf152c","Type":"ContainerStarted","Data":"da9c2a886c9767521821fa2e1ef8db93cdbe6e2bb1944b79e5b54bc0db7c0aff"}
Apr 16 15:23:44.507425 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:44.507361 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-74cc998fdb-ch4xs" podStartSLOduration=6.507343261 podStartE2EDuration="6.507343261s" podCreationTimestamp="2026-04-16 15:23:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:23:44.505220376 +0000 UTC m=+1873.687758149" watchObservedRunningTime="2026-04-16 15:23:44.507343261 +0000 UTC m=+1873.689881040"
Apr 16 15:23:47.564539 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:47.564490 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-ff85f4ccc-clxp4" podUID="97e99309-af9c-4aa1-a1ff-c22f2488ceab" containerName="llm-d-routing-sidecar" containerID="cri-o://906d9df5b7ffa2df1da9ecb4a1f97c2f09c14c078c63c5a40a227533c0b297cd" gracePeriod=2
Apr 16 15:23:47.953155 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:47.953129 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7b8478db74-lffss"
Apr 16 15:23:47.956576 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:47.956548 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-pd-test-kserve-ff85f4ccc-clxp4_97e99309-af9c-4aa1-a1ff-c22f2488ceab/main/0.log"
Apr 16 15:23:47.957311 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:47.957290 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-ff85f4ccc-clxp4"
Apr 16 15:23:48.046102 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:48.046067 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4d817844-eeaf-4e78-a00d-66aae2ac79af-kserve-provision-location\") pod \"4d817844-eeaf-4e78-a00d-66aae2ac79af\" (UID: \"4d817844-eeaf-4e78-a00d-66aae2ac79af\") "
Apr 16 15:23:48.046102 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:48.046110 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4d817844-eeaf-4e78-a00d-66aae2ac79af-tls-certs\") pod \"4d817844-eeaf-4e78-a00d-66aae2ac79af\" (UID: \"4d817844-eeaf-4e78-a00d-66aae2ac79af\") "
Apr 16 15:23:48.046358 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:48.046131 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r5mlh\" (UniqueName: \"kubernetes.io/projected/4d817844-eeaf-4e78-a00d-66aae2ac79af-kube-api-access-r5mlh\") pod \"4d817844-eeaf-4e78-a00d-66aae2ac79af\" (UID: \"4d817844-eeaf-4e78-a00d-66aae2ac79af\") "
Apr 16 15:23:48.046358 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:48.046180 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName:
\"kubernetes.io/empty-dir/97e99309-af9c-4aa1-a1ff-c22f2488ceab-kserve-provision-location\") pod \"97e99309-af9c-4aa1-a1ff-c22f2488ceab\" (UID: \"97e99309-af9c-4aa1-a1ff-c22f2488ceab\") " Apr 16 15:23:48.046358 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:48.046205 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/4d817844-eeaf-4e78-a00d-66aae2ac79af-model-cache\") pod \"4d817844-eeaf-4e78-a00d-66aae2ac79af\" (UID: \"4d817844-eeaf-4e78-a00d-66aae2ac79af\") " Apr 16 15:23:48.046358 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:48.046223 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gxrwt\" (UniqueName: \"kubernetes.io/projected/97e99309-af9c-4aa1-a1ff-c22f2488ceab-kube-api-access-gxrwt\") pod \"97e99309-af9c-4aa1-a1ff-c22f2488ceab\" (UID: \"97e99309-af9c-4aa1-a1ff-c22f2488ceab\") " Apr 16 15:23:48.046358 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:48.046248 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/97e99309-af9c-4aa1-a1ff-c22f2488ceab-dshm\") pod \"97e99309-af9c-4aa1-a1ff-c22f2488ceab\" (UID: \"97e99309-af9c-4aa1-a1ff-c22f2488ceab\") " Apr 16 15:23:48.046358 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:48.046278 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/97e99309-af9c-4aa1-a1ff-c22f2488ceab-home\") pod \"97e99309-af9c-4aa1-a1ff-c22f2488ceab\" (UID: \"97e99309-af9c-4aa1-a1ff-c22f2488ceab\") " Apr 16 15:23:48.046358 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:48.046313 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/97e99309-af9c-4aa1-a1ff-c22f2488ceab-model-cache\") pod \"97e99309-af9c-4aa1-a1ff-c22f2488ceab\" (UID: 
\"97e99309-af9c-4aa1-a1ff-c22f2488ceab\") " Apr 16 15:23:48.046358 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:48.046345 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/97e99309-af9c-4aa1-a1ff-c22f2488ceab-tls-certs\") pod \"97e99309-af9c-4aa1-a1ff-c22f2488ceab\" (UID: \"97e99309-af9c-4aa1-a1ff-c22f2488ceab\") " Apr 16 15:23:48.046756 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:48.046369 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/4d817844-eeaf-4e78-a00d-66aae2ac79af-home\") pod \"4d817844-eeaf-4e78-a00d-66aae2ac79af\" (UID: \"4d817844-eeaf-4e78-a00d-66aae2ac79af\") " Apr 16 15:23:48.046756 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:48.046402 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/4d817844-eeaf-4e78-a00d-66aae2ac79af-dshm\") pod \"4d817844-eeaf-4e78-a00d-66aae2ac79af\" (UID: \"4d817844-eeaf-4e78-a00d-66aae2ac79af\") " Apr 16 15:23:48.046756 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:48.046452 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d817844-eeaf-4e78-a00d-66aae2ac79af-model-cache" (OuterVolumeSpecName: "model-cache") pod "4d817844-eeaf-4e78-a00d-66aae2ac79af" (UID: "4d817844-eeaf-4e78-a00d-66aae2ac79af"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:23:48.046756 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:48.046644 2579 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/4d817844-eeaf-4e78-a00d-66aae2ac79af-model-cache\") on node \"ip-10-0-139-47.ec2.internal\" DevicePath \"\"" Apr 16 15:23:48.047002 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:48.046828 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97e99309-af9c-4aa1-a1ff-c22f2488ceab-home" (OuterVolumeSpecName: "home") pod "97e99309-af9c-4aa1-a1ff-c22f2488ceab" (UID: "97e99309-af9c-4aa1-a1ff-c22f2488ceab"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:23:48.049293 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:48.049245 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d817844-eeaf-4e78-a00d-66aae2ac79af-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "4d817844-eeaf-4e78-a00d-66aae2ac79af" (UID: "4d817844-eeaf-4e78-a00d-66aae2ac79af"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 15:23:48.049426 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:48.049308 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97e99309-af9c-4aa1-a1ff-c22f2488ceab-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "97e99309-af9c-4aa1-a1ff-c22f2488ceab" (UID: "97e99309-af9c-4aa1-a1ff-c22f2488ceab"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 15:23:48.049426 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:48.049389 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97e99309-af9c-4aa1-a1ff-c22f2488ceab-kube-api-access-gxrwt" (OuterVolumeSpecName: "kube-api-access-gxrwt") pod "97e99309-af9c-4aa1-a1ff-c22f2488ceab" (UID: "97e99309-af9c-4aa1-a1ff-c22f2488ceab"). InnerVolumeSpecName "kube-api-access-gxrwt". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 15:23:48.049568 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:48.049539 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97e99309-af9c-4aa1-a1ff-c22f2488ceab-model-cache" (OuterVolumeSpecName: "model-cache") pod "97e99309-af9c-4aa1-a1ff-c22f2488ceab" (UID: "97e99309-af9c-4aa1-a1ff-c22f2488ceab"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:23:48.049755 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:48.049733 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d817844-eeaf-4e78-a00d-66aae2ac79af-home" (OuterVolumeSpecName: "home") pod "4d817844-eeaf-4e78-a00d-66aae2ac79af" (UID: "4d817844-eeaf-4e78-a00d-66aae2ac79af"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:23:48.049916 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:48.049876 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d817844-eeaf-4e78-a00d-66aae2ac79af-kube-api-access-r5mlh" (OuterVolumeSpecName: "kube-api-access-r5mlh") pod "4d817844-eeaf-4e78-a00d-66aae2ac79af" (UID: "4d817844-eeaf-4e78-a00d-66aae2ac79af"). InnerVolumeSpecName "kube-api-access-r5mlh". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 15:23:48.050113 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:48.050084 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97e99309-af9c-4aa1-a1ff-c22f2488ceab-dshm" (OuterVolumeSpecName: "dshm") pod "97e99309-af9c-4aa1-a1ff-c22f2488ceab" (UID: "97e99309-af9c-4aa1-a1ff-c22f2488ceab"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:23:48.050914 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:48.050883 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d817844-eeaf-4e78-a00d-66aae2ac79af-dshm" (OuterVolumeSpecName: "dshm") pod "4d817844-eeaf-4e78-a00d-66aae2ac79af" (UID: "4d817844-eeaf-4e78-a00d-66aae2ac79af"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:23:48.106869 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:48.106779 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97e99309-af9c-4aa1-a1ff-c22f2488ceab-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "97e99309-af9c-4aa1-a1ff-c22f2488ceab" (UID: "97e99309-af9c-4aa1-a1ff-c22f2488ceab"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:23:48.111120 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:48.111092 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d817844-eeaf-4e78-a00d-66aae2ac79af-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "4d817844-eeaf-4e78-a00d-66aae2ac79af" (UID: "4d817844-eeaf-4e78-a00d-66aae2ac79af"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:23:48.147264 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:48.147226 2579 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4d817844-eeaf-4e78-a00d-66aae2ac79af-tls-certs\") on node \"ip-10-0-139-47.ec2.internal\" DevicePath \"\"" Apr 16 15:23:48.147264 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:48.147261 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-r5mlh\" (UniqueName: \"kubernetes.io/projected/4d817844-eeaf-4e78-a00d-66aae2ac79af-kube-api-access-r5mlh\") on node \"ip-10-0-139-47.ec2.internal\" DevicePath \"\"" Apr 16 15:23:48.147264 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:48.147271 2579 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/97e99309-af9c-4aa1-a1ff-c22f2488ceab-kserve-provision-location\") on node \"ip-10-0-139-47.ec2.internal\" DevicePath \"\"" Apr 16 15:23:48.147465 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:48.147281 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gxrwt\" (UniqueName: \"kubernetes.io/projected/97e99309-af9c-4aa1-a1ff-c22f2488ceab-kube-api-access-gxrwt\") on node \"ip-10-0-139-47.ec2.internal\" DevicePath \"\"" Apr 16 15:23:48.147465 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:48.147291 2579 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/97e99309-af9c-4aa1-a1ff-c22f2488ceab-dshm\") on node \"ip-10-0-139-47.ec2.internal\" DevicePath \"\"" Apr 16 15:23:48.147465 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:48.147300 2579 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/97e99309-af9c-4aa1-a1ff-c22f2488ceab-home\") on node \"ip-10-0-139-47.ec2.internal\" DevicePath \"\"" Apr 16 15:23:48.147465 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:48.147307 2579 
reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/97e99309-af9c-4aa1-a1ff-c22f2488ceab-model-cache\") on node \"ip-10-0-139-47.ec2.internal\" DevicePath \"\"" Apr 16 15:23:48.147465 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:48.147315 2579 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/97e99309-af9c-4aa1-a1ff-c22f2488ceab-tls-certs\") on node \"ip-10-0-139-47.ec2.internal\" DevicePath \"\"" Apr 16 15:23:48.147465 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:48.147324 2579 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/4d817844-eeaf-4e78-a00d-66aae2ac79af-home\") on node \"ip-10-0-139-47.ec2.internal\" DevicePath \"\"" Apr 16 15:23:48.147465 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:48.147331 2579 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/4d817844-eeaf-4e78-a00d-66aae2ac79af-dshm\") on node \"ip-10-0-139-47.ec2.internal\" DevicePath \"\"" Apr 16 15:23:48.147465 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:48.147339 2579 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4d817844-eeaf-4e78-a00d-66aae2ac79af-kserve-provision-location\") on node \"ip-10-0-139-47.ec2.internal\" DevicePath \"\"" Apr 16 15:23:48.501359 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:48.501325 2579 generic.go:358] "Generic (PLEG): container finished" podID="4d817844-eeaf-4e78-a00d-66aae2ac79af" containerID="cab518df837157361024db348ed82dd7c7754176f4f2e47f650d8e5ca75df150" exitCode=137 Apr 16 15:23:48.501557 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:48.501376 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7b8478db74-lffss" 
event={"ID":"4d817844-eeaf-4e78-a00d-66aae2ac79af","Type":"ContainerDied","Data":"cab518df837157361024db348ed82dd7c7754176f4f2e47f650d8e5ca75df150"} Apr 16 15:23:48.501557 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:48.501405 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7b8478db74-lffss" Apr 16 15:23:48.501557 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:48.501426 2579 scope.go:117] "RemoveContainer" containerID="cab518df837157361024db348ed82dd7c7754176f4f2e47f650d8e5ca75df150" Apr 16 15:23:48.501557 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:48.501412 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7b8478db74-lffss" event={"ID":"4d817844-eeaf-4e78-a00d-66aae2ac79af","Type":"ContainerDied","Data":"6fae360fb4e9e651a78a96bbac1416f3273c8950373e5de44e01115229bc8539"} Apr 16 15:23:48.503014 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:48.502999 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-pd-test-kserve-ff85f4ccc-clxp4_97e99309-af9c-4aa1-a1ff-c22f2488ceab/main/0.log" Apr 16 15:23:48.503730 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:48.503706 2579 generic.go:358] "Generic (PLEG): container finished" podID="97e99309-af9c-4aa1-a1ff-c22f2488ceab" containerID="bb7b8dc8192f550dc6cc8bff7b14ce1cd7fc6d96482b17c1bb54f3ffae1672ba" exitCode=137 Apr 16 15:23:48.503730 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:48.503728 2579 generic.go:358] "Generic (PLEG): container finished" podID="97e99309-af9c-4aa1-a1ff-c22f2488ceab" containerID="906d9df5b7ffa2df1da9ecb4a1f97c2f09c14c078c63c5a40a227533c0b297cd" exitCode=0 Apr 16 15:23:48.503880 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:48.503797 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-ff85f4ccc-clxp4" 
event={"ID":"97e99309-af9c-4aa1-a1ff-c22f2488ceab","Type":"ContainerDied","Data":"bb7b8dc8192f550dc6cc8bff7b14ce1cd7fc6d96482b17c1bb54f3ffae1672ba"} Apr 16 15:23:48.503880 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:48.503822 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-ff85f4ccc-clxp4" event={"ID":"97e99309-af9c-4aa1-a1ff-c22f2488ceab","Type":"ContainerDied","Data":"906d9df5b7ffa2df1da9ecb4a1f97c2f09c14c078c63c5a40a227533c0b297cd"} Apr 16 15:23:48.503880 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:48.503836 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-ff85f4ccc-clxp4" event={"ID":"97e99309-af9c-4aa1-a1ff-c22f2488ceab","Type":"ContainerDied","Data":"f1f24bc2c2ddd892f399358ed53a3b9fb5a7a1bc563d2f7a0446bc3f514a4967"} Apr 16 15:23:48.503880 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:48.503854 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-ff85f4ccc-clxp4" Apr 16 15:23:48.524238 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:48.524210 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7b8478db74-lffss"] Apr 16 15:23:48.525577 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:48.525554 2579 scope.go:117] "RemoveContainer" containerID="a6bce07cd908b80e8029e5e20291fb6733cf57999cfe6aa9e79d40272ce36f9f" Apr 16 15:23:48.527140 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:48.527116 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7b8478db74-lffss"] Apr 16 15:23:48.537204 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:48.537172 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-ff85f4ccc-clxp4"] Apr 16 15:23:48.541326 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:48.541296 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-ff85f4ccc-clxp4"] Apr 16 15:23:48.590689 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:48.590504 2579 scope.go:117] "RemoveContainer" containerID="cab518df837157361024db348ed82dd7c7754176f4f2e47f650d8e5ca75df150" Apr 16 15:23:48.591036 ip-10-0-139-47 kubenswrapper[2579]: E0416 15:23:48.590846 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cab518df837157361024db348ed82dd7c7754176f4f2e47f650d8e5ca75df150\": container with ID starting with cab518df837157361024db348ed82dd7c7754176f4f2e47f650d8e5ca75df150 not found: ID does not exist" containerID="cab518df837157361024db348ed82dd7c7754176f4f2e47f650d8e5ca75df150" Apr 16 15:23:48.591036 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:48.590876 2579 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"cab518df837157361024db348ed82dd7c7754176f4f2e47f650d8e5ca75df150"} err="failed to get container status \"cab518df837157361024db348ed82dd7c7754176f4f2e47f650d8e5ca75df150\": rpc error: code = NotFound desc = could not find container \"cab518df837157361024db348ed82dd7c7754176f4f2e47f650d8e5ca75df150\": container with ID starting with cab518df837157361024db348ed82dd7c7754176f4f2e47f650d8e5ca75df150 not found: ID does not exist" Apr 16 15:23:48.591036 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:48.590896 2579 scope.go:117] "RemoveContainer" containerID="a6bce07cd908b80e8029e5e20291fb6733cf57999cfe6aa9e79d40272ce36f9f" Apr 16 15:23:48.591219 ip-10-0-139-47 kubenswrapper[2579]: E0416 15:23:48.591190 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6bce07cd908b80e8029e5e20291fb6733cf57999cfe6aa9e79d40272ce36f9f\": container with ID starting with a6bce07cd908b80e8029e5e20291fb6733cf57999cfe6aa9e79d40272ce36f9f not found: ID does not exist" containerID="a6bce07cd908b80e8029e5e20291fb6733cf57999cfe6aa9e79d40272ce36f9f" Apr 16 15:23:48.591282 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:48.591217 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6bce07cd908b80e8029e5e20291fb6733cf57999cfe6aa9e79d40272ce36f9f"} err="failed to get container status \"a6bce07cd908b80e8029e5e20291fb6733cf57999cfe6aa9e79d40272ce36f9f\": rpc error: code = NotFound desc = could not find container \"a6bce07cd908b80e8029e5e20291fb6733cf57999cfe6aa9e79d40272ce36f9f\": container with ID starting with a6bce07cd908b80e8029e5e20291fb6733cf57999cfe6aa9e79d40272ce36f9f not found: ID does not exist" Apr 16 15:23:48.591282 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:48.591236 2579 scope.go:117] "RemoveContainer" containerID="bb7b8dc8192f550dc6cc8bff7b14ce1cd7fc6d96482b17c1bb54f3ffae1672ba" Apr 16 15:23:48.611833 ip-10-0-139-47 
kubenswrapper[2579]: I0416 15:23:48.611804 2579 scope.go:117] "RemoveContainer" containerID="96c416e887c2d89b369454576ab057736925a7fda4c99a647b53dbc541282dd2" Apr 16 15:23:48.678391 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:48.678365 2579 scope.go:117] "RemoveContainer" containerID="906d9df5b7ffa2df1da9ecb4a1f97c2f09c14c078c63c5a40a227533c0b297cd" Apr 16 15:23:48.687072 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:48.687053 2579 scope.go:117] "RemoveContainer" containerID="bb7b8dc8192f550dc6cc8bff7b14ce1cd7fc6d96482b17c1bb54f3ffae1672ba" Apr 16 15:23:48.687316 ip-10-0-139-47 kubenswrapper[2579]: E0416 15:23:48.687299 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb7b8dc8192f550dc6cc8bff7b14ce1cd7fc6d96482b17c1bb54f3ffae1672ba\": container with ID starting with bb7b8dc8192f550dc6cc8bff7b14ce1cd7fc6d96482b17c1bb54f3ffae1672ba not found: ID does not exist" containerID="bb7b8dc8192f550dc6cc8bff7b14ce1cd7fc6d96482b17c1bb54f3ffae1672ba" Apr 16 15:23:48.687374 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:48.687323 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb7b8dc8192f550dc6cc8bff7b14ce1cd7fc6d96482b17c1bb54f3ffae1672ba"} err="failed to get container status \"bb7b8dc8192f550dc6cc8bff7b14ce1cd7fc6d96482b17c1bb54f3ffae1672ba\": rpc error: code = NotFound desc = could not find container \"bb7b8dc8192f550dc6cc8bff7b14ce1cd7fc6d96482b17c1bb54f3ffae1672ba\": container with ID starting with bb7b8dc8192f550dc6cc8bff7b14ce1cd7fc6d96482b17c1bb54f3ffae1672ba not found: ID does not exist" Apr 16 15:23:48.687374 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:48.687342 2579 scope.go:117] "RemoveContainer" containerID="96c416e887c2d89b369454576ab057736925a7fda4c99a647b53dbc541282dd2" Apr 16 15:23:48.687559 ip-10-0-139-47 kubenswrapper[2579]: E0416 15:23:48.687540 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc 
error: code = NotFound desc = could not find container \"96c416e887c2d89b369454576ab057736925a7fda4c99a647b53dbc541282dd2\": container with ID starting with 96c416e887c2d89b369454576ab057736925a7fda4c99a647b53dbc541282dd2 not found: ID does not exist" containerID="96c416e887c2d89b369454576ab057736925a7fda4c99a647b53dbc541282dd2" Apr 16 15:23:48.687623 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:48.687570 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96c416e887c2d89b369454576ab057736925a7fda4c99a647b53dbc541282dd2"} err="failed to get container status \"96c416e887c2d89b369454576ab057736925a7fda4c99a647b53dbc541282dd2\": rpc error: code = NotFound desc = could not find container \"96c416e887c2d89b369454576ab057736925a7fda4c99a647b53dbc541282dd2\": container with ID starting with 96c416e887c2d89b369454576ab057736925a7fda4c99a647b53dbc541282dd2 not found: ID does not exist" Apr 16 15:23:48.687623 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:48.687594 2579 scope.go:117] "RemoveContainer" containerID="906d9df5b7ffa2df1da9ecb4a1f97c2f09c14c078c63c5a40a227533c0b297cd" Apr 16 15:23:48.687879 ip-10-0-139-47 kubenswrapper[2579]: E0416 15:23:48.687856 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"906d9df5b7ffa2df1da9ecb4a1f97c2f09c14c078c63c5a40a227533c0b297cd\": container with ID starting with 906d9df5b7ffa2df1da9ecb4a1f97c2f09c14c078c63c5a40a227533c0b297cd not found: ID does not exist" containerID="906d9df5b7ffa2df1da9ecb4a1f97c2f09c14c078c63c5a40a227533c0b297cd" Apr 16 15:23:48.688242 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:48.687889 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"906d9df5b7ffa2df1da9ecb4a1f97c2f09c14c078c63c5a40a227533c0b297cd"} err="failed to get container status \"906d9df5b7ffa2df1da9ecb4a1f97c2f09c14c078c63c5a40a227533c0b297cd\": rpc error: code = 
NotFound desc = could not find container \"906d9df5b7ffa2df1da9ecb4a1f97c2f09c14c078c63c5a40a227533c0b297cd\": container with ID starting with 906d9df5b7ffa2df1da9ecb4a1f97c2f09c14c078c63c5a40a227533c0b297cd not found: ID does not exist" Apr 16 15:23:48.688295 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:48.688250 2579 scope.go:117] "RemoveContainer" containerID="bb7b8dc8192f550dc6cc8bff7b14ce1cd7fc6d96482b17c1bb54f3ffae1672ba" Apr 16 15:23:48.688513 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:48.688492 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb7b8dc8192f550dc6cc8bff7b14ce1cd7fc6d96482b17c1bb54f3ffae1672ba"} err="failed to get container status \"bb7b8dc8192f550dc6cc8bff7b14ce1cd7fc6d96482b17c1bb54f3ffae1672ba\": rpc error: code = NotFound desc = could not find container \"bb7b8dc8192f550dc6cc8bff7b14ce1cd7fc6d96482b17c1bb54f3ffae1672ba\": container with ID starting with bb7b8dc8192f550dc6cc8bff7b14ce1cd7fc6d96482b17c1bb54f3ffae1672ba not found: ID does not exist" Apr 16 15:23:48.688588 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:48.688514 2579 scope.go:117] "RemoveContainer" containerID="96c416e887c2d89b369454576ab057736925a7fda4c99a647b53dbc541282dd2" Apr 16 15:23:48.688779 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:48.688751 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96c416e887c2d89b369454576ab057736925a7fda4c99a647b53dbc541282dd2"} err="failed to get container status \"96c416e887c2d89b369454576ab057736925a7fda4c99a647b53dbc541282dd2\": rpc error: code = NotFound desc = could not find container \"96c416e887c2d89b369454576ab057736925a7fda4c99a647b53dbc541282dd2\": container with ID starting with 96c416e887c2d89b369454576ab057736925a7fda4c99a647b53dbc541282dd2 not found: ID does not exist" Apr 16 15:23:48.688827 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:48.688780 2579 scope.go:117] "RemoveContainer" 
containerID="906d9df5b7ffa2df1da9ecb4a1f97c2f09c14c078c63c5a40a227533c0b297cd" Apr 16 15:23:48.689111 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:48.689094 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"906d9df5b7ffa2df1da9ecb4a1f97c2f09c14c078c63c5a40a227533c0b297cd"} err="failed to get container status \"906d9df5b7ffa2df1da9ecb4a1f97c2f09c14c078c63c5a40a227533c0b297cd\": rpc error: code = NotFound desc = could not find container \"906d9df5b7ffa2df1da9ecb4a1f97c2f09c14c078c63c5a40a227533c0b297cd\": container with ID starting with 906d9df5b7ffa2df1da9ecb4a1f97c2f09c14c078c63c5a40a227533c0b297cd not found: ID does not exist" Apr 16 15:23:48.793782 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:48.793684 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-74cc998fdb-ch4xs" Apr 16 15:23:48.793782 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:48.793739 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-74cc998fdb-ch4xs" Apr 16 15:23:48.795375 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:48.795344 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-74cc998fdb-ch4xs" podUID="f1a47ca8-0768-450c-ac73-ecf7f5bf152c" containerName="main" probeResult="failure" output="Get \"https://10.132.0.39:8000/health\": dial tcp 10.132.0.39:8000: connect: connection refused" Apr 16 15:23:49.355735 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:49.355701 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d817844-eeaf-4e78-a00d-66aae2ac79af" path="/var/lib/kubelet/pods/4d817844-eeaf-4e78-a00d-66aae2ac79af/volumes" Apr 16 15:23:49.356167 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:49.356153 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="97e99309-af9c-4aa1-a1ff-c22f2488ceab" path="/var/lib/kubelet/pods/97e99309-af9c-4aa1-a1ff-c22f2488ceab/volumes" Apr 16 15:23:58.794341 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:23:58.794299 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-74cc998fdb-ch4xs" podUID="f1a47ca8-0768-450c-ac73-ecf7f5bf152c" containerName="main" probeResult="failure" output="Get \"https://10.132.0.39:8000/health\": dial tcp 10.132.0.39:8000: connect: connection refused" Apr 16 15:24:08.794466 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:24:08.794416 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-74cc998fdb-ch4xs" podUID="f1a47ca8-0768-450c-ac73-ecf7f5bf152c" containerName="main" probeResult="failure" output="Get \"https://10.132.0.39:8000/health\": dial tcp 10.132.0.39:8000: connect: connection refused" Apr 16 15:24:18.793681 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:24:18.793633 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-74cc998fdb-ch4xs" podUID="f1a47ca8-0768-450c-ac73-ecf7f5bf152c" containerName="main" probeResult="failure" output="Get \"https://10.132.0.39:8000/health\": dial tcp 10.132.0.39:8000: connect: connection refused" Apr 16 15:24:28.793772 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:24:28.793730 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-74cc998fdb-ch4xs" podUID="f1a47ca8-0768-450c-ac73-ecf7f5bf152c" containerName="main" probeResult="failure" output="Get \"https://10.132.0.39:8000/health\": dial tcp 10.132.0.39:8000: connect: connection refused" Apr 16 15:24:38.794226 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:24:38.794176 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-74cc998fdb-ch4xs" 
podUID="f1a47ca8-0768-450c-ac73-ecf7f5bf152c" containerName="main" probeResult="failure" output="Get \"https://10.132.0.39:8000/health\": dial tcp 10.132.0.39:8000: connect: connection refused" Apr 16 15:24:48.794533 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:24:48.794479 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-74cc998fdb-ch4xs" podUID="f1a47ca8-0768-450c-ac73-ecf7f5bf152c" containerName="main" probeResult="failure" output="Get \"https://10.132.0.39:8000/health\": dial tcp 10.132.0.39:8000: connect: connection refused" Apr 16 15:24:58.793512 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:24:58.793457 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-74cc998fdb-ch4xs" podUID="f1a47ca8-0768-450c-ac73-ecf7f5bf152c" containerName="main" probeResult="failure" output="Get \"https://10.132.0.39:8000/health\": dial tcp 10.132.0.39:8000: connect: connection refused" Apr 16 15:25:08.794448 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:25:08.794391 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-74cc998fdb-ch4xs" podUID="f1a47ca8-0768-450c-ac73-ecf7f5bf152c" containerName="main" probeResult="failure" output="Get \"https://10.132.0.39:8000/health\": dial tcp 10.132.0.39:8000: connect: connection refused" Apr 16 15:25:18.804060 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:25:18.804021 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-74cc998fdb-ch4xs" Apr 16 15:25:18.815358 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:25:18.815319 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-74cc998fdb-ch4xs" Apr 16 15:25:30.212430 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:25:30.212386 2579 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-74cc998fdb-ch4xs"] Apr 16 15:25:30.212897 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:25:30.212680 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-74cc998fdb-ch4xs" podUID="f1a47ca8-0768-450c-ac73-ecf7f5bf152c" containerName="main" containerID="cri-o://da9c2a886c9767521821fa2e1ef8db93cdbe6e2bb1944b79e5b54bc0db7c0aff" gracePeriod=30 Apr 16 15:25:45.428562 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:25:45.428527 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-x74z6_6b6be884-0cea-4117-b0fb-ecbaae0e7486/istio-proxy/0.log" Apr 16 15:25:45.532289 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:25:45.532255 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-74cc998fdb-ch4xs_f1a47ca8-0768-450c-ac73-ecf7f5bf152c/main/0.log" Apr 16 15:25:45.538416 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:25:45.538383 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-74cc998fdb-ch4xs_f1a47ca8-0768-450c-ac73-ecf7f5bf152c/storage-initializer/0.log" Apr 16 15:25:46.469140 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:25:46.469104 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-x74z6_6b6be884-0cea-4117-b0fb-ecbaae0e7486/istio-proxy/0.log" Apr 16 15:25:46.539554 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:25:46.539519 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-74cc998fdb-ch4xs_f1a47ca8-0768-450c-ac73-ecf7f5bf152c/main/0.log" Apr 16 15:25:46.546181 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:25:46.546157 2579 
log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-74cc998fdb-ch4xs_f1a47ca8-0768-450c-ac73-ecf7f5bf152c/storage-initializer/0.log" Apr 16 15:25:47.457374 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:25:47.457323 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-x74z6_6b6be884-0cea-4117-b0fb-ecbaae0e7486/istio-proxy/0.log" Apr 16 15:25:47.524460 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:25:47.524436 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-74cc998fdb-ch4xs_f1a47ca8-0768-450c-ac73-ecf7f5bf152c/main/0.log" Apr 16 15:25:47.530548 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:25:47.530521 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-74cc998fdb-ch4xs_f1a47ca8-0768-450c-ac73-ecf7f5bf152c/storage-initializer/0.log" Apr 16 15:25:48.438880 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:25:48.438850 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-x74z6_6b6be884-0cea-4117-b0fb-ecbaae0e7486/istio-proxy/0.log" Apr 16 15:25:48.506395 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:25:48.506362 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-74cc998fdb-ch4xs_f1a47ca8-0768-450c-ac73-ecf7f5bf152c/main/0.log" Apr 16 15:25:48.511952 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:25:48.511928 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-74cc998fdb-ch4xs_f1a47ca8-0768-450c-ac73-ecf7f5bf152c/storage-initializer/0.log" Apr 16 15:25:49.397609 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:25:49.397580 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-x74z6_6b6be884-0cea-4117-b0fb-ecbaae0e7486/istio-proxy/0.log" Apr 16 15:25:49.478118 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:25:49.478081 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-74cc998fdb-ch4xs_f1a47ca8-0768-450c-ac73-ecf7f5bf152c/main/0.log" Apr 16 15:25:49.486508 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:25:49.486481 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-74cc998fdb-ch4xs_f1a47ca8-0768-450c-ac73-ecf7f5bf152c/storage-initializer/0.log" Apr 16 15:25:50.370235 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:25:50.370192 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-x74z6_6b6be884-0cea-4117-b0fb-ecbaae0e7486/istio-proxy/0.log" Apr 16 15:25:50.442546 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:25:50.442504 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-74cc998fdb-ch4xs_f1a47ca8-0768-450c-ac73-ecf7f5bf152c/main/0.log" Apr 16 15:25:50.448378 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:25:50.448356 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-74cc998fdb-ch4xs_f1a47ca8-0768-450c-ac73-ecf7f5bf152c/storage-initializer/0.log" Apr 16 15:25:51.413898 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:25:51.413864 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-x74z6_6b6be884-0cea-4117-b0fb-ecbaae0e7486/istio-proxy/0.log" Apr 16 15:25:51.484445 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:25:51.484418 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-74cc998fdb-ch4xs_f1a47ca8-0768-450c-ac73-ecf7f5bf152c/main/0.log" Apr 16 15:25:51.490222 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:25:51.490202 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-74cc998fdb-ch4xs_f1a47ca8-0768-450c-ac73-ecf7f5bf152c/storage-initializer/0.log" Apr 16 15:25:52.398625 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:25:52.398597 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-x74z6_6b6be884-0cea-4117-b0fb-ecbaae0e7486/istio-proxy/0.log" Apr 16 15:25:52.480448 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:25:52.480413 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-74cc998fdb-ch4xs_f1a47ca8-0768-450c-ac73-ecf7f5bf152c/main/0.log" Apr 16 15:25:52.486461 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:25:52.486437 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-74cc998fdb-ch4xs_f1a47ca8-0768-450c-ac73-ecf7f5bf152c/storage-initializer/0.log" Apr 16 15:25:53.368949 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:25:53.368917 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-x74z6_6b6be884-0cea-4117-b0fb-ecbaae0e7486/istio-proxy/0.log" Apr 16 15:25:53.449960 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:25:53.449927 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-74cc998fdb-ch4xs_f1a47ca8-0768-450c-ac73-ecf7f5bf152c/main/0.log" Apr 16 15:25:53.458803 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:25:53.458774 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-74cc998fdb-ch4xs_f1a47ca8-0768-450c-ac73-ecf7f5bf152c/storage-initializer/0.log" Apr 16 15:25:54.357250 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:25:54.357220 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-x74z6_6b6be884-0cea-4117-b0fb-ecbaae0e7486/istio-proxy/0.log" Apr 16 15:25:54.430138 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:25:54.430103 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-74cc998fdb-ch4xs_f1a47ca8-0768-450c-ac73-ecf7f5bf152c/main/0.log" Apr 16 15:25:54.437534 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:25:54.437506 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-74cc998fdb-ch4xs_f1a47ca8-0768-450c-ac73-ecf7f5bf152c/storage-initializer/0.log" Apr 16 15:25:55.333751 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:25:55.333719 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-x74z6_6b6be884-0cea-4117-b0fb-ecbaae0e7486/istio-proxy/0.log" Apr 16 15:25:55.418662 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:25:55.418628 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-74cc998fdb-ch4xs_f1a47ca8-0768-450c-ac73-ecf7f5bf152c/main/0.log" Apr 16 15:25:55.428200 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:25:55.428171 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-74cc998fdb-ch4xs_f1a47ca8-0768-450c-ac73-ecf7f5bf152c/storage-initializer/0.log" Apr 16 15:25:56.342114 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:25:56.342083 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-x74z6_6b6be884-0cea-4117-b0fb-ecbaae0e7486/istio-proxy/0.log" Apr 16 15:25:56.417601 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:25:56.417573 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-74cc998fdb-ch4xs_f1a47ca8-0768-450c-ac73-ecf7f5bf152c/main/0.log" Apr 16 15:25:56.425033 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:25:56.425010 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-74cc998fdb-ch4xs_f1a47ca8-0768-450c-ac73-ecf7f5bf152c/storage-initializer/0.log" Apr 16 15:25:57.404143 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:25:57.404105 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-x74z6_6b6be884-0cea-4117-b0fb-ecbaae0e7486/istio-proxy/0.log" Apr 16 15:25:57.485695 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:25:57.485662 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-74cc998fdb-ch4xs_f1a47ca8-0768-450c-ac73-ecf7f5bf152c/main/0.log" Apr 16 15:25:57.492749 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:25:57.492721 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-74cc998fdb-ch4xs_f1a47ca8-0768-450c-ac73-ecf7f5bf152c/storage-initializer/0.log" Apr 16 15:25:58.442775 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:25:58.442743 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-x74z6_6b6be884-0cea-4117-b0fb-ecbaae0e7486/istio-proxy/0.log" Apr 16 15:25:58.517426 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:25:58.517395 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-74cc998fdb-ch4xs_f1a47ca8-0768-450c-ac73-ecf7f5bf152c/main/0.log" Apr 16 15:25:58.523266 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:25:58.523233 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-74cc998fdb-ch4xs_f1a47ca8-0768-450c-ac73-ecf7f5bf152c/storage-initializer/0.log" Apr 16 15:26:00.466476 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:00.466409 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-74cc998fdb-ch4xs" Apr 16 15:26:00.508173 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:00.508138 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f1a47ca8-0768-450c-ac73-ecf7f5bf152c-home\") pod \"f1a47ca8-0768-450c-ac73-ecf7f5bf152c\" (UID: \"f1a47ca8-0768-450c-ac73-ecf7f5bf152c\") " Apr 16 15:26:00.508381 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:00.508216 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f1a47ca8-0768-450c-ac73-ecf7f5bf152c-kserve-provision-location\") pod \"f1a47ca8-0768-450c-ac73-ecf7f5bf152c\" (UID: \"f1a47ca8-0768-450c-ac73-ecf7f5bf152c\") " Apr 16 15:26:00.508381 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:00.508253 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f1a47ca8-0768-450c-ac73-ecf7f5bf152c-tls-certs\") pod \"f1a47ca8-0768-450c-ac73-ecf7f5bf152c\" (UID: \"f1a47ca8-0768-450c-ac73-ecf7f5bf152c\") " Apr 16 15:26:00.508381 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:00.508286 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jsmk4\" (UniqueName: 
\"kubernetes.io/projected/f1a47ca8-0768-450c-ac73-ecf7f5bf152c-kube-api-access-jsmk4\") pod \"f1a47ca8-0768-450c-ac73-ecf7f5bf152c\" (UID: \"f1a47ca8-0768-450c-ac73-ecf7f5bf152c\") " Apr 16 15:26:00.508381 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:00.508343 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f1a47ca8-0768-450c-ac73-ecf7f5bf152c-dshm\") pod \"f1a47ca8-0768-450c-ac73-ecf7f5bf152c\" (UID: \"f1a47ca8-0768-450c-ac73-ecf7f5bf152c\") " Apr 16 15:26:00.508583 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:00.508396 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f1a47ca8-0768-450c-ac73-ecf7f5bf152c-model-cache\") pod \"f1a47ca8-0768-450c-ac73-ecf7f5bf152c\" (UID: \"f1a47ca8-0768-450c-ac73-ecf7f5bf152c\") " Apr 16 15:26:00.508583 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:00.508565 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1a47ca8-0768-450c-ac73-ecf7f5bf152c-home" (OuterVolumeSpecName: "home") pod "f1a47ca8-0768-450c-ac73-ecf7f5bf152c" (UID: "f1a47ca8-0768-450c-ac73-ecf7f5bf152c"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:26:00.508771 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:00.508733 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1a47ca8-0768-450c-ac73-ecf7f5bf152c-model-cache" (OuterVolumeSpecName: "model-cache") pod "f1a47ca8-0768-450c-ac73-ecf7f5bf152c" (UID: "f1a47ca8-0768-450c-ac73-ecf7f5bf152c"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:26:00.510490 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:00.510458 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1a47ca8-0768-450c-ac73-ecf7f5bf152c-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "f1a47ca8-0768-450c-ac73-ecf7f5bf152c" (UID: "f1a47ca8-0768-450c-ac73-ecf7f5bf152c"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 15:26:00.510971 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:00.510940 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1a47ca8-0768-450c-ac73-ecf7f5bf152c-kube-api-access-jsmk4" (OuterVolumeSpecName: "kube-api-access-jsmk4") pod "f1a47ca8-0768-450c-ac73-ecf7f5bf152c" (UID: "f1a47ca8-0768-450c-ac73-ecf7f5bf152c"). InnerVolumeSpecName "kube-api-access-jsmk4". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 15:26:00.511046 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:00.511008 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1a47ca8-0768-450c-ac73-ecf7f5bf152c-dshm" (OuterVolumeSpecName: "dshm") pod "f1a47ca8-0768-450c-ac73-ecf7f5bf152c" (UID: "f1a47ca8-0768-450c-ac73-ecf7f5bf152c"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:26:00.576323 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:00.576275 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1a47ca8-0768-450c-ac73-ecf7f5bf152c-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f1a47ca8-0768-450c-ac73-ecf7f5bf152c" (UID: "f1a47ca8-0768-450c-ac73-ecf7f5bf152c"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:26:00.608939 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:00.608884 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jsmk4\" (UniqueName: \"kubernetes.io/projected/f1a47ca8-0768-450c-ac73-ecf7f5bf152c-kube-api-access-jsmk4\") on node \"ip-10-0-139-47.ec2.internal\" DevicePath \"\"" Apr 16 15:26:00.608939 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:00.608937 2579 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f1a47ca8-0768-450c-ac73-ecf7f5bf152c-dshm\") on node \"ip-10-0-139-47.ec2.internal\" DevicePath \"\"" Apr 16 15:26:00.609109 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:00.608949 2579 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f1a47ca8-0768-450c-ac73-ecf7f5bf152c-model-cache\") on node \"ip-10-0-139-47.ec2.internal\" DevicePath \"\"" Apr 16 15:26:00.609109 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:00.608959 2579 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f1a47ca8-0768-450c-ac73-ecf7f5bf152c-home\") on node \"ip-10-0-139-47.ec2.internal\" DevicePath \"\"" Apr 16 15:26:00.609109 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:00.608967 2579 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f1a47ca8-0768-450c-ac73-ecf7f5bf152c-kserve-provision-location\") on node \"ip-10-0-139-47.ec2.internal\" DevicePath \"\"" Apr 16 15:26:00.609109 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:00.608975 2579 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f1a47ca8-0768-450c-ac73-ecf7f5bf152c-tls-certs\") on node \"ip-10-0-139-47.ec2.internal\" DevicePath \"\"" Apr 16 15:26:00.947552 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:00.947508 2579 generic.go:358] 
"Generic (PLEG): container finished" podID="f1a47ca8-0768-450c-ac73-ecf7f5bf152c" containerID="da9c2a886c9767521821fa2e1ef8db93cdbe6e2bb1944b79e5b54bc0db7c0aff" exitCode=137 Apr 16 15:26:00.947758 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:00.947587 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-74cc998fdb-ch4xs" Apr 16 15:26:00.947758 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:00.947633 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-74cc998fdb-ch4xs" event={"ID":"f1a47ca8-0768-450c-ac73-ecf7f5bf152c","Type":"ContainerDied","Data":"da9c2a886c9767521821fa2e1ef8db93cdbe6e2bb1944b79e5b54bc0db7c0aff"} Apr 16 15:26:00.947758 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:00.947685 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-74cc998fdb-ch4xs" event={"ID":"f1a47ca8-0768-450c-ac73-ecf7f5bf152c","Type":"ContainerDied","Data":"8a8d1b61af885b1b5cd5d3ac99ff612693df3e8d6e89f67d9e4bccf4443b14e6"} Apr 16 15:26:00.947758 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:00.947708 2579 scope.go:117] "RemoveContainer" containerID="da9c2a886c9767521821fa2e1ef8db93cdbe6e2bb1944b79e5b54bc0db7c0aff" Apr 16 15:26:00.969543 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:00.969505 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-74cc998fdb-ch4xs"] Apr 16 15:26:00.970038 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:00.970013 2579 scope.go:117] "RemoveContainer" containerID="626130f608602bcd778c651c111374e05f192d0de5f980a2e44448c106f3f990" Apr 16 15:26:00.972751 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:00.972729 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-74cc998fdb-ch4xs"] Apr 16 
15:26:00.981768 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:00.981745 2579 scope.go:117] "RemoveContainer" containerID="da9c2a886c9767521821fa2e1ef8db93cdbe6e2bb1944b79e5b54bc0db7c0aff" Apr 16 15:26:00.982155 ip-10-0-139-47 kubenswrapper[2579]: E0416 15:26:00.982131 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da9c2a886c9767521821fa2e1ef8db93cdbe6e2bb1944b79e5b54bc0db7c0aff\": container with ID starting with da9c2a886c9767521821fa2e1ef8db93cdbe6e2bb1944b79e5b54bc0db7c0aff not found: ID does not exist" containerID="da9c2a886c9767521821fa2e1ef8db93cdbe6e2bb1944b79e5b54bc0db7c0aff" Apr 16 15:26:00.982285 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:00.982166 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da9c2a886c9767521821fa2e1ef8db93cdbe6e2bb1944b79e5b54bc0db7c0aff"} err="failed to get container status \"da9c2a886c9767521821fa2e1ef8db93cdbe6e2bb1944b79e5b54bc0db7c0aff\": rpc error: code = NotFound desc = could not find container \"da9c2a886c9767521821fa2e1ef8db93cdbe6e2bb1944b79e5b54bc0db7c0aff\": container with ID starting with da9c2a886c9767521821fa2e1ef8db93cdbe6e2bb1944b79e5b54bc0db7c0aff not found: ID does not exist" Apr 16 15:26:00.982285 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:00.982192 2579 scope.go:117] "RemoveContainer" containerID="626130f608602bcd778c651c111374e05f192d0de5f980a2e44448c106f3f990" Apr 16 15:26:00.982550 ip-10-0-139-47 kubenswrapper[2579]: E0416 15:26:00.982518 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"626130f608602bcd778c651c111374e05f192d0de5f980a2e44448c106f3f990\": container with ID starting with 626130f608602bcd778c651c111374e05f192d0de5f980a2e44448c106f3f990 not found: ID does not exist" containerID="626130f608602bcd778c651c111374e05f192d0de5f980a2e44448c106f3f990" Apr 16 15:26:00.982629 
ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:00.982562 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"626130f608602bcd778c651c111374e05f192d0de5f980a2e44448c106f3f990"} err="failed to get container status \"626130f608602bcd778c651c111374e05f192d0de5f980a2e44448c106f3f990\": rpc error: code = NotFound desc = could not find container \"626130f608602bcd778c651c111374e05f192d0de5f980a2e44448c106f3f990\": container with ID starting with 626130f608602bcd778c651c111374e05f192d0de5f980a2e44448c106f3f990 not found: ID does not exist" Apr 16 15:26:01.005477 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:01.005440 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-7587b89b76-xb67r_6e2062c3-c136-4b9a-a040-c9f6c9c23a3c/manager/0.log" Apr 16 15:26:01.354177 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:01.354144 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1a47ca8-0768-450c-ac73-ecf7f5bf152c" path="/var/lib/kubelet/pods/f1a47ca8-0768-450c-ac73-ecf7f5bf152c/volumes" Apr 16 15:26:03.328816 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:03.328778 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-f5p4n/must-gather-7qmtw"] Apr 16 15:26:03.329221 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:03.329091 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4d817844-eeaf-4e78-a00d-66aae2ac79af" containerName="storage-initializer" Apr 16 15:26:03.329221 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:03.329105 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d817844-eeaf-4e78-a00d-66aae2ac79af" containerName="storage-initializer" Apr 16 15:26:03.329221 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:03.329115 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f1a47ca8-0768-450c-ac73-ecf7f5bf152c" containerName="storage-initializer" Apr 16 
15:26:03.329221 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:03.329121 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1a47ca8-0768-450c-ac73-ecf7f5bf152c" containerName="storage-initializer" Apr 16 15:26:03.329221 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:03.329131 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="97e99309-af9c-4aa1-a1ff-c22f2488ceab" containerName="storage-initializer" Apr 16 15:26:03.329221 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:03.329137 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="97e99309-af9c-4aa1-a1ff-c22f2488ceab" containerName="storage-initializer" Apr 16 15:26:03.329221 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:03.329145 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="97e99309-af9c-4aa1-a1ff-c22f2488ceab" containerName="main" Apr 16 15:26:03.329221 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:03.329150 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="97e99309-af9c-4aa1-a1ff-c22f2488ceab" containerName="main" Apr 16 15:26:03.329221 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:03.329159 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="97e99309-af9c-4aa1-a1ff-c22f2488ceab" containerName="llm-d-routing-sidecar" Apr 16 15:26:03.329221 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:03.329167 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="97e99309-af9c-4aa1-a1ff-c22f2488ceab" containerName="llm-d-routing-sidecar" Apr 16 15:26:03.329221 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:03.329177 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f1a47ca8-0768-450c-ac73-ecf7f5bf152c" containerName="main" Apr 16 15:26:03.329221 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:03.329182 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1a47ca8-0768-450c-ac73-ecf7f5bf152c" containerName="main" Apr 16 15:26:03.329221 
ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:03.329190 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4d817844-eeaf-4e78-a00d-66aae2ac79af" containerName="main"
Apr 16 15:26:03.329221 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:03.329195 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d817844-eeaf-4e78-a00d-66aae2ac79af" containerName="main"
Apr 16 15:26:03.329640 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:03.329246 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="97e99309-af9c-4aa1-a1ff-c22f2488ceab" containerName="main"
Apr 16 15:26:03.329640 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:03.329258 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="4d817844-eeaf-4e78-a00d-66aae2ac79af" containerName="main"
Apr 16 15:26:03.329640 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:03.329265 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="97e99309-af9c-4aa1-a1ff-c22f2488ceab" containerName="llm-d-routing-sidecar"
Apr 16 15:26:03.329640 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:03.329270 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="f1a47ca8-0768-450c-ac73-ecf7f5bf152c" containerName="main"
Apr 16 15:26:03.332458 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:03.332436 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-f5p4n/must-gather-7qmtw"
Apr 16 15:26:03.335628 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:03.335609 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-f5p4n\"/\"openshift-service-ca.crt\""
Apr 16 15:26:03.335728 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:03.335609 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-f5p4n\"/\"kube-root-ca.crt\""
Apr 16 15:26:03.336445 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:03.336428 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-f5p4n\"/\"default-dockercfg-fv9wn\""
Apr 16 15:26:03.343634 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:03.343615 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-f5p4n/must-gather-7qmtw"]
Apr 16 15:26:03.429940 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:03.429881 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/33291542-fa59-4a1b-a78c-165c36614b8e-must-gather-output\") pod \"must-gather-7qmtw\" (UID: \"33291542-fa59-4a1b-a78c-165c36614b8e\") " pod="openshift-must-gather-f5p4n/must-gather-7qmtw"
Apr 16 15:26:03.430139 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:03.429978 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ldnb\" (UniqueName: \"kubernetes.io/projected/33291542-fa59-4a1b-a78c-165c36614b8e-kube-api-access-4ldnb\") pod \"must-gather-7qmtw\" (UID: \"33291542-fa59-4a1b-a78c-165c36614b8e\") " pod="openshift-must-gather-f5p4n/must-gather-7qmtw"
Apr 16 15:26:03.530518 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:03.530475 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/33291542-fa59-4a1b-a78c-165c36614b8e-must-gather-output\") pod \"must-gather-7qmtw\" (UID: \"33291542-fa59-4a1b-a78c-165c36614b8e\") " pod="openshift-must-gather-f5p4n/must-gather-7qmtw"
Apr 16 15:26:03.530690 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:03.530560 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4ldnb\" (UniqueName: \"kubernetes.io/projected/33291542-fa59-4a1b-a78c-165c36614b8e-kube-api-access-4ldnb\") pod \"must-gather-7qmtw\" (UID: \"33291542-fa59-4a1b-a78c-165c36614b8e\") " pod="openshift-must-gather-f5p4n/must-gather-7qmtw"
Apr 16 15:26:03.530844 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:03.530825 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/33291542-fa59-4a1b-a78c-165c36614b8e-must-gather-output\") pod \"must-gather-7qmtw\" (UID: \"33291542-fa59-4a1b-a78c-165c36614b8e\") " pod="openshift-must-gather-f5p4n/must-gather-7qmtw"
Apr 16 15:26:03.540612 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:03.540576 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ldnb\" (UniqueName: \"kubernetes.io/projected/33291542-fa59-4a1b-a78c-165c36614b8e-kube-api-access-4ldnb\") pod \"must-gather-7qmtw\" (UID: \"33291542-fa59-4a1b-a78c-165c36614b8e\") " pod="openshift-must-gather-f5p4n/must-gather-7qmtw"
Apr 16 15:26:03.641866 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:03.641777 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-f5p4n/must-gather-7qmtw"
Apr 16 15:26:03.762274 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:03.762236 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-f5p4n/must-gather-7qmtw"]
Apr 16 15:26:03.766488 ip-10-0-139-47 kubenswrapper[2579]: W0416 15:26:03.766454 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod33291542_fa59_4a1b_a78c_165c36614b8e.slice/crio-db4d15bd56a6f42518e7c25f1cafee7816b17eacabb50f311a80164986e5ad75 WatchSource:0}: Error finding container db4d15bd56a6f42518e7c25f1cafee7816b17eacabb50f311a80164986e5ad75: Status 404 returned error can't find the container with id db4d15bd56a6f42518e7c25f1cafee7816b17eacabb50f311a80164986e5ad75
Apr 16 15:26:03.768194 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:03.768176 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 15:26:03.959405 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:03.959314 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-f5p4n/must-gather-7qmtw" event={"ID":"33291542-fa59-4a1b-a78c-165c36614b8e","Type":"ContainerStarted","Data":"db4d15bd56a6f42518e7c25f1cafee7816b17eacabb50f311a80164986e5ad75"}
Apr 16 15:26:04.965818 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:04.965703 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-f5p4n/must-gather-7qmtw" event={"ID":"33291542-fa59-4a1b-a78c-165c36614b8e","Type":"ContainerStarted","Data":"5c78dad3a1236e5e63ecad236d7b7fbfe8cc65d461afa9ee4cb6e57755234ea9"}
Apr 16 15:26:04.965818 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:04.965757 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-f5p4n/must-gather-7qmtw" event={"ID":"33291542-fa59-4a1b-a78c-165c36614b8e","Type":"ContainerStarted","Data":"9f1d618ded08b2edde0512ecd217d36bdba8f278c23e99ebe7d034be7d428d1d"}
Apr 16 15:26:04.982552 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:04.981581 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-f5p4n/must-gather-7qmtw" podStartSLOduration=1.100122972 podStartE2EDuration="1.981560842s" podCreationTimestamp="2026-04-16 15:26:03 +0000 UTC" firstStartedPulling="2026-04-16 15:26:03.768302479 +0000 UTC m=+2012.950840232" lastFinishedPulling="2026-04-16 15:26:04.649740335 +0000 UTC m=+2013.832278102" observedRunningTime="2026-04-16 15:26:04.98033427 +0000 UTC m=+2014.162872041" watchObservedRunningTime="2026-04-16 15:26:04.981560842 +0000 UTC m=+2014.164098618"
Apr 16 15:26:06.360723 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:06.360683 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-m9n7j_aee05764-3834-4ece-aff3-6c244c99378a/global-pull-secret-syncer/0.log"
Apr 16 15:26:06.400770 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:06.400737 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-8jpq8_aabf9cf2-dcc2-4a42-b211-e34be354e3ed/konnectivity-agent/0.log"
Apr 16 15:26:06.495183 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:06.495155 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-139-47.ec2.internal_da3e9a6a335603c20524d6311693f61c/haproxy/0.log"
Apr 16 15:26:10.681784 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:10.681749 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-7587b89b76-xb67r_6e2062c3-c136-4b9a-a040-c9f6c9c23a3c/manager/0.log"
Apr 16 15:26:12.010785 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:12.010744 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-jpjvx_1a854354-6033-440e-91bf-c5287d20c508/node-exporter/0.log"
Apr 16 15:26:12.034527 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:12.034486 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-jpjvx_1a854354-6033-440e-91bf-c5287d20c508/kube-rbac-proxy/0.log"
Apr 16 15:26:12.053594 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:12.053568 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-jpjvx_1a854354-6033-440e-91bf-c5287d20c508/init-textfile/0.log"
Apr 16 15:26:14.079748 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:14.079721 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-5cb6cf4cb4-7st9s_26a2efd2-c6ce-46da-b1fe-6a739a8edd83/networking-console-plugin/0.log"
Apr 16 15:26:15.510680 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:15.510641 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-f5p4n/perf-node-gather-daemonset-cnkx7"]
Apr 16 15:26:15.517580 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:15.517542 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-f5p4n/perf-node-gather-daemonset-cnkx7"
Apr 16 15:26:15.520080 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:15.520051 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-f5p4n/perf-node-gather-daemonset-cnkx7"]
Apr 16 15:26:15.642246 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:15.642209 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/2622e622-2cab-4d97-a576-675313aed895-proc\") pod \"perf-node-gather-daemonset-cnkx7\" (UID: \"2622e622-2cab-4d97-a576-675313aed895\") " pod="openshift-must-gather-f5p4n/perf-node-gather-daemonset-cnkx7"
Apr 16 15:26:15.642440 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:15.642263 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2622e622-2cab-4d97-a576-675313aed895-lib-modules\") pod \"perf-node-gather-daemonset-cnkx7\" (UID: \"2622e622-2cab-4d97-a576-675313aed895\") " pod="openshift-must-gather-f5p4n/perf-node-gather-daemonset-cnkx7"
Apr 16 15:26:15.642440 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:15.642343 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fx6h6\" (UniqueName: \"kubernetes.io/projected/2622e622-2cab-4d97-a576-675313aed895-kube-api-access-fx6h6\") pod \"perf-node-gather-daemonset-cnkx7\" (UID: \"2622e622-2cab-4d97-a576-675313aed895\") " pod="openshift-must-gather-f5p4n/perf-node-gather-daemonset-cnkx7"
Apr 16 15:26:15.642440 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:15.642388 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2622e622-2cab-4d97-a576-675313aed895-sys\") pod \"perf-node-gather-daemonset-cnkx7\" (UID: \"2622e622-2cab-4d97-a576-675313aed895\") " pod="openshift-must-gather-f5p4n/perf-node-gather-daemonset-cnkx7"
Apr 16 15:26:15.642440 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:15.642416 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/2622e622-2cab-4d97-a576-675313aed895-podres\") pod \"perf-node-gather-daemonset-cnkx7\" (UID: \"2622e622-2cab-4d97-a576-675313aed895\") " pod="openshift-must-gather-f5p4n/perf-node-gather-daemonset-cnkx7"
Apr 16 15:26:15.744601 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:15.743855 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/2622e622-2cab-4d97-a576-675313aed895-podres\") pod \"perf-node-gather-daemonset-cnkx7\" (UID: \"2622e622-2cab-4d97-a576-675313aed895\") " pod="openshift-must-gather-f5p4n/perf-node-gather-daemonset-cnkx7"
Apr 16 15:26:15.744601 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:15.743967 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/2622e622-2cab-4d97-a576-675313aed895-proc\") pod \"perf-node-gather-daemonset-cnkx7\" (UID: \"2622e622-2cab-4d97-a576-675313aed895\") " pod="openshift-must-gather-f5p4n/perf-node-gather-daemonset-cnkx7"
Apr 16 15:26:15.744601 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:15.744001 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2622e622-2cab-4d97-a576-675313aed895-lib-modules\") pod \"perf-node-gather-daemonset-cnkx7\" (UID: \"2622e622-2cab-4d97-a576-675313aed895\") " pod="openshift-must-gather-f5p4n/perf-node-gather-daemonset-cnkx7"
Apr 16 15:26:15.744601 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:15.744062 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fx6h6\" (UniqueName: \"kubernetes.io/projected/2622e622-2cab-4d97-a576-675313aed895-kube-api-access-fx6h6\") pod \"perf-node-gather-daemonset-cnkx7\" (UID: \"2622e622-2cab-4d97-a576-675313aed895\") " pod="openshift-must-gather-f5p4n/perf-node-gather-daemonset-cnkx7"
Apr 16 15:26:15.744601 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:15.744108 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2622e622-2cab-4d97-a576-675313aed895-sys\") pod \"perf-node-gather-daemonset-cnkx7\" (UID: \"2622e622-2cab-4d97-a576-675313aed895\") " pod="openshift-must-gather-f5p4n/perf-node-gather-daemonset-cnkx7"
Apr 16 15:26:15.744601 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:15.744202 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2622e622-2cab-4d97-a576-675313aed895-sys\") pod \"perf-node-gather-daemonset-cnkx7\" (UID: \"2622e622-2cab-4d97-a576-675313aed895\") " pod="openshift-must-gather-f5p4n/perf-node-gather-daemonset-cnkx7"
Apr 16 15:26:15.744601 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:15.744333 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/2622e622-2cab-4d97-a576-675313aed895-podres\") pod \"perf-node-gather-daemonset-cnkx7\" (UID: \"2622e622-2cab-4d97-a576-675313aed895\") " pod="openshift-must-gather-f5p4n/perf-node-gather-daemonset-cnkx7"
Apr 16 15:26:15.744601 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:15.744380 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/2622e622-2cab-4d97-a576-675313aed895-proc\") pod \"perf-node-gather-daemonset-cnkx7\" (UID: \"2622e622-2cab-4d97-a576-675313aed895\") " pod="openshift-must-gather-f5p4n/perf-node-gather-daemonset-cnkx7"
Apr 16 15:26:15.744601 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:15.744461 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2622e622-2cab-4d97-a576-675313aed895-lib-modules\") pod \"perf-node-gather-daemonset-cnkx7\" (UID: \"2622e622-2cab-4d97-a576-675313aed895\") " pod="openshift-must-gather-f5p4n/perf-node-gather-daemonset-cnkx7"
Apr 16 15:26:15.753001 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:15.752964 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fx6h6\" (UniqueName: \"kubernetes.io/projected/2622e622-2cab-4d97-a576-675313aed895-kube-api-access-fx6h6\") pod \"perf-node-gather-daemonset-cnkx7\" (UID: \"2622e622-2cab-4d97-a576-675313aed895\") " pod="openshift-must-gather-f5p4n/perf-node-gather-daemonset-cnkx7"
Apr 16 15:26:15.830164 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:15.830065 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-f5p4n/perf-node-gather-daemonset-cnkx7"
Apr 16 15:26:15.978318 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:15.978256 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-f5p4n/perf-node-gather-daemonset-cnkx7"]
Apr 16 15:26:15.984086 ip-10-0-139-47 kubenswrapper[2579]: W0416 15:26:15.984034 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod2622e622_2cab_4d97_a576_675313aed895.slice/crio-200da2b9234ce0f1aa63d8e7e51dd7c66bb9cd385ac5f251431703267c283cdc WatchSource:0}: Error finding container 200da2b9234ce0f1aa63d8e7e51dd7c66bb9cd385ac5f251431703267c283cdc: Status 404 returned error can't find the container with id 200da2b9234ce0f1aa63d8e7e51dd7c66bb9cd385ac5f251431703267c283cdc
Apr 16 15:26:16.016772 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:16.016732 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-f5p4n/perf-node-gather-daemonset-cnkx7" event={"ID":"2622e622-2cab-4d97-a576-675313aed895","Type":"ContainerStarted","Data":"200da2b9234ce0f1aa63d8e7e51dd7c66bb9cd385ac5f251431703267c283cdc"}
Apr 16 15:26:16.295160 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:16.295123 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-hfndr_545cbae5-e741-4260-bd9d-d9cf35cd5a5a/dns/0.log"
Apr 16 15:26:16.313095 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:16.313066 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-hfndr_545cbae5-e741-4260-bd9d-d9cf35cd5a5a/kube-rbac-proxy/0.log"
Apr 16 15:26:16.373416 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:16.373340 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-mvlp4_c0f8acc5-b88a-4adf-b2de-441b178001bf/dns-node-resolver/0.log"
Apr 16 15:26:16.905093 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:16.905042 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-dl2m4_da8fe018-bd6c-490e-9197-d5ea9c881a92/node-ca/0.log"
Apr 16 15:26:17.024538 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:17.024494 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-f5p4n/perf-node-gather-daemonset-cnkx7" event={"ID":"2622e622-2cab-4d97-a576-675313aed895","Type":"ContainerStarted","Data":"13e7af46f1a668c7a52999edb3cbebcd137b24c6a84c7ae952b5b2ea7b582e11"}
Apr 16 15:26:17.024764 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:17.024747 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-f5p4n/perf-node-gather-daemonset-cnkx7"
Apr 16 15:26:17.044533 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:17.044473 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-f5p4n/perf-node-gather-daemonset-cnkx7" podStartSLOduration=2.044454018 podStartE2EDuration="2.044454018s" podCreationTimestamp="2026-04-16 15:26:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:26:17.043518065 +0000 UTC m=+2026.226055840" watchObservedRunningTime="2026-04-16 15:26:17.044454018 +0000 UTC m=+2026.226991792"
Apr 16 15:26:18.219286 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:18.219259 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-kd7rp_91fb67d9-4995-4dfe-bc80-067575a9d732/serve-healthcheck-canary/0.log"
Apr 16 15:26:18.768128 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:18.768101 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-ktl5m_0c829937-28c4-4030-9425-91eda92303e3/kube-rbac-proxy/0.log"
Apr 16 15:26:18.786035 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:18.786002 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-ktl5m_0c829937-28c4-4030-9425-91eda92303e3/exporter/0.log"
Apr 16 15:26:18.804896 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:18.804870 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-ktl5m_0c829937-28c4-4030-9425-91eda92303e3/extractor/0.log"
Apr 16 15:26:21.287919 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:21.287874 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-5bfb495c77-6ljq6_30a313db-687c-4458-9be0-9d0c90a93236/manager/0.log"
Apr 16 15:26:22.281040 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:22.281003 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-48t7w_4af35d6f-d455-4a38-9e0d-561821b8e721/manager/0.log"
Apr 16 15:26:22.298360 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:22.298326 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-j9trf_ccba7863-bc8c-4937-b1f7-fd72d717b3f6/s3-init/0.log"
Apr 16 15:26:24.042633 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:24.042605 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-f5p4n/perf-node-gather-daemonset-cnkx7"
Apr 16 15:26:28.159637 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:28.159604 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-dtvll_b9885cab-5589-4597-9b6e-6ef20f7bc2b2/kube-multus-additional-cni-plugins/0.log"
Apr 16 15:26:28.177439 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:28.177406 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-dtvll_b9885cab-5589-4597-9b6e-6ef20f7bc2b2/egress-router-binary-copy/0.log"
Apr 16 15:26:28.198944 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:28.198827 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-dtvll_b9885cab-5589-4597-9b6e-6ef20f7bc2b2/cni-plugins/0.log"
Apr 16 15:26:28.217923 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:28.217879 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-dtvll_b9885cab-5589-4597-9b6e-6ef20f7bc2b2/bond-cni-plugin/0.log"
Apr 16 15:26:28.240396 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:28.240363 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-dtvll_b9885cab-5589-4597-9b6e-6ef20f7bc2b2/routeoverride-cni/0.log"
Apr 16 15:26:28.260377 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:28.260349 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-dtvll_b9885cab-5589-4597-9b6e-6ef20f7bc2b2/whereabouts-cni-bincopy/0.log"
Apr 16 15:26:28.279818 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:28.279790 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-dtvll_b9885cab-5589-4597-9b6e-6ef20f7bc2b2/whereabouts-cni/0.log"
Apr 16 15:26:28.467808 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:28.467778 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-flwwz_16b9981c-80e1-46aa-90b5-651d968a8850/kube-multus/0.log"
Apr 16 15:26:28.591284 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:28.591209 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-vpmk7_ba7ca911-2481-4fe7-9079-e770b1840406/network-metrics-daemon/0.log"
Apr 16 15:26:28.606642 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:28.606616 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-vpmk7_ba7ca911-2481-4fe7-9079-e770b1840406/kube-rbac-proxy/0.log"
Apr 16 15:26:29.864460 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:29.864428 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wrxcw_ac641325-3949-4503-86d7-77e5ededa110/ovn-controller/0.log"
Apr 16 15:26:29.900008 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:29.899981 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wrxcw_ac641325-3949-4503-86d7-77e5ededa110/ovn-acl-logging/0.log"
Apr 16 15:26:29.922071 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:29.922042 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wrxcw_ac641325-3949-4503-86d7-77e5ededa110/kube-rbac-proxy-node/0.log"
Apr 16 15:26:29.942730 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:29.942700 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wrxcw_ac641325-3949-4503-86d7-77e5ededa110/kube-rbac-proxy-ovn-metrics/0.log"
Apr 16 15:26:29.957844 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:29.957815 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wrxcw_ac641325-3949-4503-86d7-77e5ededa110/northd/0.log"
Apr 16 15:26:29.975559 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:29.975523 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wrxcw_ac641325-3949-4503-86d7-77e5ededa110/nbdb/0.log"
Apr 16 15:26:29.995776 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:29.995740 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wrxcw_ac641325-3949-4503-86d7-77e5ededa110/sbdb/0.log"
Apr 16 15:26:30.197240 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:30.197157 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wrxcw_ac641325-3949-4503-86d7-77e5ededa110/ovnkube-controller/0.log"
Apr 16 15:26:31.289384 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:31.289352 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-qh75b_1609e3b0-3601-4e3b-bb3c-b89f06a20319/network-check-target-container/0.log"
Apr 16 15:26:32.408528 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:32.408501 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-nsdp5_15789381-f4bd-4597-8239-7c3b98a87e12/iptables-alerter/0.log"
Apr 16 15:26:33.127813 ip-10-0-139-47 kubenswrapper[2579]: I0416 15:26:33.127775 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-96845_f9f0740c-e453-47f4-a40d-564dd1731056/tuned/0.log"