Apr 16 16:47:54.742698 ip-10-0-138-58 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 16:47:55.194830 ip-10-0-138-58 kubenswrapper[2573]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 16:47:55.194830 ip-10-0-138-58 kubenswrapper[2573]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 16:47:55.194830 ip-10-0-138-58 kubenswrapper[2573]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 16:47:55.194830 ip-10-0-138-58 kubenswrapper[2573]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 16:47:55.194830 ip-10-0-138-58 kubenswrapper[2573]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 16:47:55.195838 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.195759 2573 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 16:47:55.197858 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.197843 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 16:47:55.197858 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.197858 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 16:47:55.197921 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.197861 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 16:47:55.197921 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.197865 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 16:47:55.197921 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.197868 2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 16:47:55.197921 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.197871 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 16:47:55.197921 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.197875 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 16:47:55.197921 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.197880 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 16:47:55.197921 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.197883 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 16:47:55.197921 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.197887 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 16:47:55.197921 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.197890 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 16:47:55.197921 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.197893 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 16:47:55.197921 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.197895 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 16:47:55.197921 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.197898 2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 16:47:55.197921 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.197901 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 16:47:55.197921 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.197904 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 16:47:55.197921 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.197906 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 16:47:55.197921 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.197909 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 16:47:55.197921 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.197912 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 16:47:55.197921 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.197915 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 16:47:55.197921 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.197917 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 16:47:55.197921 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.197920 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 16:47:55.198406 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.197923 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 16:47:55.198406 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.197926 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 16:47:55.198406 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.197928 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 16:47:55.198406 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.197931 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 16:47:55.198406 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.197934 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 16:47:55.198406 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.197937 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 16:47:55.198406 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.197939 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 16:47:55.198406 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.197942 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 16:47:55.198406 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.197944 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 16:47:55.198406 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.197947 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 16:47:55.198406 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.197950 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 16:47:55.198406 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.197953 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 16:47:55.198406 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.197955 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 16:47:55.198406 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.197959 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 16:47:55.198406 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.197963 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 16:47:55.198406 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.197965 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 16:47:55.198406 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.197968 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 16:47:55.198406 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.197971 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 16:47:55.198406 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.197973 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 16:47:55.198895 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.197976 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 16:47:55.198895 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.197978 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 16:47:55.198895 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.197981 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 16:47:55.198895 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.197983 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 16:47:55.198895 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.197986 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 16:47:55.198895 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.197988 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 16:47:55.198895 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.197991 2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 16:47:55.198895 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.197993 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 16:47:55.198895 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.197996 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 16:47:55.198895 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.197998 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 16:47:55.198895 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198001 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 16:47:55.198895 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198003 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 16:47:55.198895 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198006 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 16:47:55.198895 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198008 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 16:47:55.198895 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198011 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 16:47:55.198895 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198014 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 16:47:55.198895 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198017 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 16:47:55.198895 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198021 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 16:47:55.198895 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198023 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 16:47:55.198895 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198026 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 16:47:55.199375 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198029 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 16:47:55.199375 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198031 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 16:47:55.199375 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198034 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 16:47:55.199375 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198037 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 16:47:55.199375 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198040 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 16:47:55.199375 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198042 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 16:47:55.199375 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198045 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 16:47:55.199375 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198047 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 16:47:55.199375 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198050 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 16:47:55.199375 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198053 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 16:47:55.199375 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198055 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 16:47:55.199375 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198058 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 16:47:55.199375 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198060 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 16:47:55.199375 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198064 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 16:47:55.199375 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198066 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 16:47:55.199375 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198069 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 16:47:55.199375 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198071 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 16:47:55.199375 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198074 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 16:47:55.199375 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198076 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 16:47:55.199886 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198079 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 16:47:55.199886 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198082 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 16:47:55.199886 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198084 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 16:47:55.199886 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198087 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 16:47:55.199886 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198089 2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 16:47:55.199886 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198092 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 16:47:55.199886 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198463 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 16:47:55.199886 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198469 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 16:47:55.199886 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198472 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 16:47:55.199886 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198475 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 16:47:55.199886 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198477 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 16:47:55.199886 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198480 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 16:47:55.199886 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198483 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 16:47:55.199886 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198486 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 16:47:55.199886 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198488 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 16:47:55.199886 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198491 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 16:47:55.199886 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198494 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 16:47:55.199886 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198497 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 16:47:55.199886 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198499 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 16:47:55.199886 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198502 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 16:47:55.200360 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198504 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 16:47:55.200360 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198507 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 16:47:55.200360 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198510 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 16:47:55.200360 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198512 2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 16:47:55.200360 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198515 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 16:47:55.200360 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198517 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 16:47:55.200360 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198520 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 16:47:55.200360 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198523 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 16:47:55.200360 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198526 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 16:47:55.200360 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198530 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 16:47:55.200360 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198533 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 16:47:55.200360 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198536 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 16:47:55.200360 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198539 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 16:47:55.200360 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198541 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 16:47:55.200360 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198544 2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 16:47:55.200360 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198546 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 16:47:55.200360 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198549 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 16:47:55.200360 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198551 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 16:47:55.200360 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198554 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 16:47:55.201200 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198557 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 16:47:55.201200 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198560 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 16:47:55.201200 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198563 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 16:47:55.201200 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198566 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 16:47:55.201200 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198568 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 16:47:55.201200 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198571 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 16:47:55.201200 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198574 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 16:47:55.201200 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198576 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 16:47:55.201200 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198579 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 16:47:55.201200 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198582 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 16:47:55.201200 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198585 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 16:47:55.201200 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198587 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 16:47:55.201200 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198590 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 16:47:55.201200 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198592 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 16:47:55.201200 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198595 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 16:47:55.201200 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198598 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 16:47:55.201200 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198600 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 16:47:55.201200 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198603 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 16:47:55.201200 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198607 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 16:47:55.201200 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198610 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 16:47:55.201793 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198613 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 16:47:55.201793 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198616 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 16:47:55.201793 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198619 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 16:47:55.201793 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198621 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 16:47:55.201793 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198624 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 16:47:55.201793 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198627 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 16:47:55.201793 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198629 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 16:47:55.201793 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198632 2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 16:47:55.201793 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198634 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 16:47:55.201793 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198637 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 16:47:55.201793 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198640 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 16:47:55.201793 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198642 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 16:47:55.201793 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198646 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 16:47:55.201793 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198649 2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 16:47:55.201793 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198652 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 16:47:55.201793 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198654 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 16:47:55.201793 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198657 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 16:47:55.201793 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198660 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 16:47:55.201793 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198663 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 16:47:55.201793 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198666 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 16:47:55.202282 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198668 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 16:47:55.202282 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198671 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 16:47:55.202282 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198674 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 16:47:55.202282 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198677 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 16:47:55.202282 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198679 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 16:47:55.202282 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198682 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 16:47:55.202282 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198684 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 16:47:55.202282 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198687 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 16:47:55.202282 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198690 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 16:47:55.202282 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198692 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 16:47:55.202282 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198695 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 16:47:55.202282 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198697 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 16:47:55.202282 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.198700 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 16:47:55.202282 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.199947 2573 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 16:47:55.202282 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.199957 2573 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 16:47:55.202282 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.199964 2573 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 16:47:55.202282 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.199968 2573 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 16:47:55.202282 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.199973 2573 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 16:47:55.202282 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.199976 2573 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 16:47:55.202282 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.199980 2573 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 16:47:55.202282 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.199984 2573 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 16:47:55.202804 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.199988 2573 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 16:47:55.202804 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.199991 2573 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 16:47:55.202804 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.199995 2573 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 16:47:55.202804 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.199998 2573 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 16:47:55.202804 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200001 2573 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 16:47:55.202804 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200004 2573 flags.go:64] FLAG: --cgroup-root=""
Apr 16 16:47:55.202804 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200007 2573 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 16:47:55.202804 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200010 2573 flags.go:64] FLAG: --client-ca-file=""
Apr 16 16:47:55.202804 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200013 2573 flags.go:64] FLAG: --cloud-config=""
Apr 16 16:47:55.202804 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200016 2573 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 16:47:55.202804 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200019 2573 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 16:47:55.202804 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200023 2573 flags.go:64] FLAG: --cluster-domain=""
Apr 16 16:47:55.202804 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200026 2573 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 16:47:55.202804 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200029 2573 flags.go:64] FLAG: --config-dir=""
Apr 16 16:47:55.202804 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200032 2573 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 16:47:55.202804 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200036 2573 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 16:47:55.202804 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200039 2573 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 16:47:55.202804 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200042 2573 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 16:47:55.202804 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200045 2573 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 16:47:55.202804 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200049 2573 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 16:47:55.202804 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200052 2573 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 16:47:55.202804 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200055 2573 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 16:47:55.202804 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200058 2573 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 16:47:55.202804 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200061 2573 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 16:47:55.202804 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200064 2573 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 16:47:55.203411 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200069 2573 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 16:47:55.203411 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200072 2573 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 16:47:55.203411 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200074 2573 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 16:47:55.203411 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200077 2573 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 16:47:55.203411 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200080 2573 flags.go:64] FLAG: --enable-server="true"
Apr 16 16:47:55.203411 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200083 2573 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 16:47:55.203411 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200087 2573 flags.go:64] FLAG: --event-burst="100"
Apr 16 16:47:55.203411 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200090 2573 flags.go:64] FLAG: --event-qps="50"
Apr 16 16:47:55.203411 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200093 2573 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 16:47:55.203411 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200097 2573 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 16:47:55.203411 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200100 2573 flags.go:64] FLAG: --eviction-hard=""
Apr 16 16:47:55.203411 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200103 2573 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 16:47:55.203411 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200106 2573 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 16:47:55.203411 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200109 2573 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 16:47:55.203411 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200112 2573 flags.go:64] FLAG: --eviction-soft=""
Apr 16 16:47:55.203411 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200115 2573 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 16:47:55.203411 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200118 2573 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 16:47:55.203411 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200121 2573 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 16 16:47:55.203411 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200124 2573 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 16 16:47:55.203411 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200127 2573 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 16 16:47:55.203411 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200130 2573 flags.go:64] FLAG: --fail-swap-on="true"
Apr 16 16:47:55.203411 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200133 2573 flags.go:64] FLAG: --feature-gates=""
Apr 16 16:47:55.203411 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200136 2573 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 16 16:47:55.203411 ip-10-0-138-58 kubenswrapper[2573]:
I0416 16:47:55.200139 2573 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 16 16:47:55.203411 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200143 2573 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 16 16:47:55.204017 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200146 2573 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 16 16:47:55.204017 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200149 2573 flags.go:64] FLAG: --healthz-port="10248" Apr 16 16:47:55.204017 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200152 2573 flags.go:64] FLAG: --help="false" Apr 16 16:47:55.204017 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200155 2573 flags.go:64] FLAG: --hostname-override="ip-10-0-138-58.ec2.internal" Apr 16 16:47:55.204017 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200158 2573 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 16:47:55.204017 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200161 2573 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 16:47:55.204017 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200164 2573 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 16:47:55.204017 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200168 2573 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 16:47:55.204017 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200171 2573 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 16:47:55.204017 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200174 2573 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 16:47:55.204017 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200177 2573 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 16:47:55.204017 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200180 2573 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 
16:47:55.204017 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200183 2573 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 16:47:55.204017 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200186 2573 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 16:47:55.204017 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200189 2573 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 16:47:55.204017 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200192 2573 flags.go:64] FLAG: --kube-reserved="" Apr 16 16:47:55.204017 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200194 2573 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 16:47:55.204017 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200197 2573 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 16:47:55.204017 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200200 2573 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 16:47:55.204017 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200203 2573 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 16:47:55.204017 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200206 2573 flags.go:64] FLAG: --lock-file="" Apr 16 16:47:55.204017 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200209 2573 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 16:47:55.204017 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200212 2573 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 16:47:55.204017 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200215 2573 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 16:47:55.204631 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200220 2573 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 16:47:55.204631 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200223 2573 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 16:47:55.204631 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200225 2573 flags.go:64] FLAG: 
--log-text-split-stream="false" Apr 16 16:47:55.204631 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200228 2573 flags.go:64] FLAG: --logging-format="text" Apr 16 16:47:55.204631 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200231 2573 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 16:47:55.204631 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200234 2573 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 16:47:55.204631 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200237 2573 flags.go:64] FLAG: --manifest-url="" Apr 16 16:47:55.204631 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200241 2573 flags.go:64] FLAG: --manifest-url-header="" Apr 16 16:47:55.204631 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200245 2573 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 16:47:55.204631 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200250 2573 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 16:47:55.204631 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200254 2573 flags.go:64] FLAG: --max-pods="110" Apr 16 16:47:55.204631 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200257 2573 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 16:47:55.204631 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200260 2573 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 16:47:55.204631 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200263 2573 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 16:47:55.204631 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200266 2573 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 16:47:55.204631 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200269 2573 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 16:47:55.204631 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200272 2573 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 16:47:55.204631 ip-10-0-138-58 
kubenswrapper[2573]: I0416 16:47:55.200275 2573 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 16:47:55.204631 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200282 2573 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 16:47:55.204631 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200286 2573 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 16:47:55.204631 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200288 2573 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 16:47:55.204631 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200291 2573 flags.go:64] FLAG: --pod-cidr="" Apr 16 16:47:55.204631 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200294 2573 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc76bab72f320de3d4105c90d73c4fb139c09e20ce0fa8dcbc0cb59920d27dec" Apr 16 16:47:55.205188 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200299 2573 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 16:47:55.205188 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200302 2573 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 16:47:55.205188 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200305 2573 flags.go:64] FLAG: --pods-per-core="0" Apr 16 16:47:55.205188 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200308 2573 flags.go:64] FLAG: --port="10250" Apr 16 16:47:55.205188 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200311 2573 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 16:47:55.205188 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200314 2573 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-00cb95a3e65f92e40" Apr 16 16:47:55.205188 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200317 2573 flags.go:64] FLAG: --qos-reserved="" Apr 16 16:47:55.205188 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200320 2573 flags.go:64] FLAG: --read-only-port="10255" Apr 16 16:47:55.205188 
ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200323 2573 flags.go:64] FLAG: --register-node="true" Apr 16 16:47:55.205188 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200326 2573 flags.go:64] FLAG: --register-schedulable="true" Apr 16 16:47:55.205188 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200329 2573 flags.go:64] FLAG: --register-with-taints="" Apr 16 16:47:55.205188 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200332 2573 flags.go:64] FLAG: --registry-burst="10" Apr 16 16:47:55.205188 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200335 2573 flags.go:64] FLAG: --registry-qps="5" Apr 16 16:47:55.205188 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200338 2573 flags.go:64] FLAG: --reserved-cpus="" Apr 16 16:47:55.205188 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200341 2573 flags.go:64] FLAG: --reserved-memory="" Apr 16 16:47:55.205188 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200344 2573 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 16:47:55.205188 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200347 2573 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 16:47:55.205188 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200350 2573 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 16:47:55.205188 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200355 2573 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 16:47:55.205188 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200357 2573 flags.go:64] FLAG: --runonce="false" Apr 16 16:47:55.205188 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200360 2573 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 16:47:55.205188 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200364 2573 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 16:47:55.205188 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200367 2573 flags.go:64] FLAG: --seccomp-default="false" Apr 16 16:47:55.205188 
ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200370 2573 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 16:47:55.205188 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200373 2573 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 16:47:55.205188 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200390 2573 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 16:47:55.205828 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200393 2573 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 16:47:55.205828 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200396 2573 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 16:47:55.205828 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200399 2573 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 16:47:55.205828 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200402 2573 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 16:47:55.205828 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200405 2573 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 16:47:55.205828 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200408 2573 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 16:47:55.205828 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200411 2573 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 16:47:55.205828 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200414 2573 flags.go:64] FLAG: --system-cgroups="" Apr 16 16:47:55.205828 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200417 2573 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 16:47:55.205828 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200422 2573 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 16:47:55.205828 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200425 2573 flags.go:64] FLAG: --tls-cert-file="" Apr 16 16:47:55.205828 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200427 2573 
flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 16 16:47:55.205828 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200431 2573 flags.go:64] FLAG: --tls-min-version="" Apr 16 16:47:55.205828 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200433 2573 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 16:47:55.205828 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200436 2573 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 16:47:55.205828 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200439 2573 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 16:47:55.205828 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200442 2573 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 16:47:55.205828 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200445 2573 flags.go:64] FLAG: --v="2" Apr 16 16:47:55.205828 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200448 2573 flags.go:64] FLAG: --version="false" Apr 16 16:47:55.205828 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200452 2573 flags.go:64] FLAG: --vmodule="" Apr 16 16:47:55.205828 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200456 2573 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 16:47:55.205828 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.200459 2573 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 16:47:55.205828 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.201829 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 16:47:55.205828 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.201843 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 16:47:55.205828 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.201847 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 16:47:55.206483 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.201851 2573 feature_gate.go:328] unrecognized feature gate: 
GCPClusterHostedDNSInstall Apr 16 16:47:55.206483 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.201853 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 16:47:55.206483 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.201856 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 16:47:55.206483 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.201859 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 16:47:55.206483 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.201863 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 16:47:55.206483 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.201865 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 16:47:55.206483 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.201868 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 16:47:55.206483 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.201871 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 16:47:55.206483 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.201874 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 16:47:55.206483 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.201877 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 16:47:55.206483 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.201880 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 16:47:55.206483 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.201883 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 16:47:55.206483 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.201886 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 16:47:55.206483 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.201888 2573 
feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 16:47:55.206483 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.201891 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 16:47:55.206483 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.201894 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 16:47:55.206483 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.201896 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 16:47:55.206483 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.201899 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 16:47:55.206483 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.201902 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 16:47:55.206483 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.201904 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 16:47:55.206980 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.201907 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 16:47:55.206980 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.201910 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 16:47:55.206980 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.201912 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 16:47:55.206980 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.201916 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 16:47:55.206980 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.201918 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 16:47:55.206980 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.201921 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 16:47:55.206980 
ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.201924 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 16:47:55.206980 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.201927 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 16:47:55.206980 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.201930 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 16:47:55.206980 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.201932 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 16:47:55.206980 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.201937 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 16:47:55.206980 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.201949 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 16:47:55.206980 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.201952 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 16:47:55.206980 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.201955 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 16:47:55.206980 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.201957 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 16:47:55.206980 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.201960 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 16:47:55.206980 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.201963 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 16:47:55.206980 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.201966 2573 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 16:47:55.206980 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.201968 2573 feature_gate.go:328] unrecognized 
feature gate: BootcNodeManagement Apr 16 16:47:55.206980 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.201971 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 16:47:55.207531 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.201974 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 16:47:55.207531 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.201977 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 16:47:55.207531 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.201979 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 16:47:55.207531 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.201983 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 16:47:55.207531 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.201985 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 16:47:55.207531 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.201988 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 16:47:55.207531 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.201991 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 16:47:55.207531 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.201994 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 16:47:55.207531 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.201996 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 16:47:55.207531 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.201999 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 16:47:55.207531 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.202001 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 16:47:55.207531 
ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.202004 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 16:47:55.207531 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.202006 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 16:47:55.207531 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.202011 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 16:47:55.207531 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.202015 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 16:47:55.207531 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.202018 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 16:47:55.207531 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.202021 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 16:47:55.207531 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.202024 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 16:47:55.207531 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.202027 2573 feature_gate.go:328] unrecognized feature gate: Example Apr 16 16:47:55.208023 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.202030 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 16:47:55.208023 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.202033 2573 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 16:47:55.208023 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.202035 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 16:47:55.208023 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.202041 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 16:47:55.208023 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.202044 2573 feature_gate.go:328] unrecognized feature gate: 
ImageStreamImportMode Apr 16 16:47:55.208023 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.202046 2573 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 16:47:55.208023 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.202050 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 16:47:55.208023 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.202054 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 16:47:55.208023 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.202057 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 16:47:55.208023 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.202059 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 16:47:55.208023 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.202062 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 16:47:55.208023 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.202065 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 16:47:55.208023 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.202067 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 16:47:55.208023 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.202070 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 16:47:55.208023 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.202073 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 16:47:55.208023 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.202075 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 16:47:55.208023 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.202080 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 16:47:55.208023 ip-10-0-138-58 kubenswrapper[2573]: W0416 
16:47:55.202082 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 16:47:55.208023 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.202085 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 16:47:55.208023 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.202088 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 16:47:55.208541 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.202090 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 16:47:55.208541 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.202093 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 16:47:55.208541 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.202096 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 16:47:55.208541 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.202098 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 16:47:55.208541 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.202792 2573 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 16:47:55.209689 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.209589 2573 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 16 16:47:55.209726 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.209691 2573 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 16 16:47:55.209757 ip-10-0-138-58 kubenswrapper[2573]: 
W0416 16:47:55.209736 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 16:47:55.209757 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.209742 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 16:47:55.209757 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.209745 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 16:47:55.209757 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.209749 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 16:47:55.209757 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.209752 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 16:47:55.209757 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.209755 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 16:47:55.209757 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.209758 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 16:47:55.209757 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.209761 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 16:47:55.209958 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.209765 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 16:47:55.209958 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.209768 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 16:47:55.209958 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.209770 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 16:47:55.209958 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.209773 2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 16:47:55.209958 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.209775 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 16:47:55.209958 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.209778 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 16:47:55.209958 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.209781 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 16:47:55.209958 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.209783 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 16:47:55.209958 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.209786 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 16:47:55.209958 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.209788 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 16:47:55.209958 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.209791 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 16:47:55.209958 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.209793 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 16:47:55.209958 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.209796 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 16:47:55.209958 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.209799 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 16:47:55.209958 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.209802 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 16:47:55.209958 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.209805 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 16:47:55.209958 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.209808 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 16:47:55.209958 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.209812 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 16:47:55.209958 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.209815 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 16:47:55.209958 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.209818 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 16:47:55.210465 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.209820 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 16:47:55.210465 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.209823 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 16:47:55.210465 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.209826 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 16:47:55.210465 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.209828 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 16:47:55.210465 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.209831 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 16:47:55.210465 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.209833 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 16:47:55.210465 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.209836 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 16:47:55.210465 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.209838 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 16:47:55.210465 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.209841 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 16:47:55.210465 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.209843 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 16:47:55.210465 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.209846 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 16:47:55.210465 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.209848 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 16:47:55.210465 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.209851 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 16:47:55.210465 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.209855 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 16:47:55.210465 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.209857 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 16:47:55.210465 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.209860 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 16:47:55.210465 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.209862 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 16:47:55.210465 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.209865 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 16:47:55.210465 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.209868 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 16:47:55.210959 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.209870 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 16:47:55.210959 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.209873 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 16:47:55.210959 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.209876 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 16:47:55.210959 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.209878 2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 16:47:55.210959 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.209881 2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 16:47:55.210959 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.209883 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 16:47:55.210959 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.209886 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 16:47:55.210959 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.209889 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 16:47:55.210959 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.209893 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 16:47:55.210959 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.209897 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 16:47:55.210959 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.209901 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 16:47:55.210959 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.209904 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 16:47:55.210959 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.209907 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 16:47:55.210959 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.209910 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 16:47:55.210959 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.209912 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 16:47:55.210959 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.209915 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 16:47:55.210959 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.209918 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 16:47:55.210959 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.209921 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 16:47:55.210959 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.209923 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 16:47:55.210959 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.209926 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 16:47:55.211465 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.209928 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 16:47:55.211465 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.209931 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 16:47:55.211465 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.209934 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 16:47:55.211465 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.209936 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 16:47:55.211465 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.209939 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 16:47:55.211465 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.209941 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 16:47:55.211465 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.209944 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 16:47:55.211465 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.209947 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 16:47:55.211465 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.209949 2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 16:47:55.211465 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.209952 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 16:47:55.211465 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.209955 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 16:47:55.211465 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.209957 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 16:47:55.211465 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.209960 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 16:47:55.211465 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.209963 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 16:47:55.211465 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.209965 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 16:47:55.211465 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.209968 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 16:47:55.211465 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.209971 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 16:47:55.211465 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.209973 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 16:47:55.211465 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.209976 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 16:47:55.211942 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.209981 2573 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 16:47:55.211942 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.210076 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 16:47:55.211942 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.210080 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 16:47:55.211942 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.210083 2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 16:47:55.211942 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.210086 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 16:47:55.211942 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.210089 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 16:47:55.211942 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.210092 2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 16:47:55.211942 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.210095 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 16:47:55.211942 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.210097 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 16:47:55.211942 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.210100 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 16:47:55.211942 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.210102 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 16:47:55.211942 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.210105 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 16:47:55.211942 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.210108 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 16:47:55.211942 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.210110 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 16:47:55.211942 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.210113 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 16:47:55.211942 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.210115 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 16:47:55.212339 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.210118 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 16:47:55.212339 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.210122 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 16:47:55.212339 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.210126 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 16:47:55.212339 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.210129 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 16:47:55.212339 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.210133 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 16:47:55.212339 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.210136 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 16:47:55.212339 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.210139 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 16:47:55.212339 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.210142 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 16:47:55.212339 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.210144 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 16:47:55.212339 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.210147 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 16:47:55.212339 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.210149 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 16:47:55.212339 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.210152 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 16:47:55.212339 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.210154 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 16:47:55.212339 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.210156 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 16:47:55.212339 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.210159 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 16:47:55.212339 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.210161 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 16:47:55.212339 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.210164 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 16:47:55.212339 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.210166 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 16:47:55.212339 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.210169 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 16:47:55.212818 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.210172 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 16:47:55.212818 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.210175 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 16:47:55.212818 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.210177 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 16:47:55.212818 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.210179 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 16:47:55.212818 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.210182 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 16:47:55.212818 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.210184 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 16:47:55.212818 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.210187 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 16:47:55.212818 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.210189 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 16:47:55.212818 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.210192 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 16:47:55.212818 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.210195 2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 16:47:55.212818 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.210198 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 16:47:55.212818 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.210200 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 16:47:55.212818 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.210203 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 16:47:55.212818 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.210205 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 16:47:55.212818 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.210208 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 16:47:55.212818 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.210210 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 16:47:55.212818 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.210213 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 16:47:55.212818 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.210216 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 16:47:55.212818 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.210218 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 16:47:55.212818 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.210221 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 16:47:55.213304 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.210223 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 16:47:55.213304 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.210226 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 16:47:55.213304 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.210228 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 16:47:55.213304 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.210231 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 16:47:55.213304 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.210233 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 16:47:55.213304 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.210236 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 16:47:55.213304 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.210238 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 16:47:55.213304 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.210241 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 16:47:55.213304 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.210243 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 16:47:55.213304 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.210246 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 16:47:55.213304 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.210248 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 16:47:55.213304 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.210251 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 16:47:55.213304 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.210254 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 16:47:55.213304 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.210256 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 16:47:55.213304 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.210259 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 16:47:55.213304 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.210261 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 16:47:55.213304 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.210264 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 16:47:55.213304 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.210266 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 16:47:55.213304 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.210269 2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 16:47:55.213304 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.210271 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 16:47:55.213849 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.210274 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 16:47:55.213849 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.210277 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 16:47:55.213849 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.210279 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 16:47:55.213849 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.210282 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 16:47:55.213849 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.210284 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 16:47:55.213849 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.210287 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 16:47:55.213849 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.210289 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 16:47:55.213849 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.210292 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 16:47:55.213849 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.210295 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 16:47:55.213849 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.210297 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 16:47:55.213849 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.210300 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 16:47:55.213849 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:55.210302 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 16:47:55.213849 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.210307 2573 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 16:47:55.213849 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.210412 2573 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 16 16:47:55.216545 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.216532 2573 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 16 16:47:55.217544 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.217533 2573 server.go:1019] "Starting client certificate rotation"
Apr 16 16:47:55.217653 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.217636 2573 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 16:47:55.217717 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.217679 2573 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 16:47:55.245599 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.245581 2573 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 16:47:55.247268 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.247243 2573 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 16:47:55.267533 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.267515 2573 log.go:25] "Validated CRI v1 runtime API"
Apr 16 16:47:55.272195 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.272181 2573 log.go:25] "Validated CRI v1 image API"
Apr 16 16:47:55.273246 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.273231 2573 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 16 16:47:55.276156 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.276135 2573 fs.go:135] Filesystem UUIDs: map[0e5f192e-53ab-484d-8a82-7d3407807638:/dev/nvme0n1p4 11751525-5078-49f4-83ed-e6d0bc605056:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2]
Apr 16 16:47:55.276215 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.276155 2573 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 16 16:47:55.279475 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.279456 2573 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 16:47:55.283703 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.283585 2573 manager.go:217] Machine: {Timestamp:2026-04-16 16:47:55.281757489 +0000 UTC m=+0.422476223 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3100212 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec235c94f58b7b6561d09dadf6570ed1 SystemUUID:ec235c94-f58b-7b65-61d0-9dadf6570ed1 BootID:3cf36ef2-e850-42b6-b3ca-998dc66555e4 Filesystems:[{Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6098944 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:91:8a:6c:fe:4f Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:91:8a:6c:fe:4f Speed:0 Mtu:9001} {Name:ovs-system MacAddress:0e:ce:ff:b7:97:7e Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 16 16:47:55.283703 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.283692 2573 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 16 16:47:55.283820 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.283788 2573 manager.go:233] Version: {KernelVersion:5.14.0-570.104.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260401-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 16 16:47:55.286205 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.286185 2573 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 16:47:55.286329 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.286207 2573 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-138-58.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 16 16:47:55.286370 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.286338 2573 topology_manager.go:138] "Creating topology manager with none policy"
Apr 16 16:47:55.286370 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.286346 2573 container_manager_linux.go:306] "Creating device plugin manager"
Apr 16 16:47:55.286370 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.286359 2573 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 16:47:55.287100 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.287088 2573 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 16:47:55.288222 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.288212 2573 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 16:47:55.288331 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.288322 2573 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 16 16:47:55.290670 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.290661 2573 kubelet.go:491] "Attempting to sync node with API server"
Apr 16 16:47:55.290700 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.290674 2573 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 16 16:47:55.290700 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.290685 2573 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 16 16:47:55.290700 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.290694 2573 kubelet.go:397] "Adding apiserver pod source"
Apr 16 16:47:55.290798 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.290702 2573 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 16 16:47:55.291818 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.291807 2573 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 16:47:55.291857 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.291824 2573 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 16:47:55.297313 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.297292 2573 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 16 16:47:55.299478 ip-10-0-138-58
kubenswrapper[2573]: I0416 16:47:55.299465 2573 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 16 16:47:55.300719 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.300707 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 16 16:47:55.300787 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.300723 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 16 16:47:55.300787 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.300729 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 16 16:47:55.300787 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.300734 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 16 16:47:55.300787 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.300739 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 16 16:47:55.300787 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.300745 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 16 16:47:55.300787 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.300750 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 16 16:47:55.300787 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.300755 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 16 16:47:55.300787 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.300762 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 16 16:47:55.300787 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.300768 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 16 16:47:55.300787 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.300776 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 16 16:47:55.300787 
ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.300784 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 16 16:47:55.301671 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.301660 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 16 16:47:55.301671 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.301670 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 16 16:47:55.303601 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:47:55.303578 2573 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-138-58.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 16 16:47:55.303601 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:47:55.303577 2573 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 16 16:47:55.305259 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.305246 2573 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 16 16:47:55.305301 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.305287 2573 server.go:1295] "Started kubelet" Apr 16 16:47:55.305433 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.305369 2573 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 16 16:47:55.305494 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.305456 2573 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 16 16:47:55.305550 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.305527 2573 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 16 16:47:55.305958 
ip-10-0-138-58 systemd[1]: Started Kubernetes Kubelet. Apr 16 16:47:55.306531 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.306517 2573 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 16 16:47:55.308137 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.308121 2573 server.go:317] "Adding debug handlers to kubelet server" Apr 16 16:47:55.312331 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.312316 2573 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 16 16:47:55.312431 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.312334 2573 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 16 16:47:55.313093 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.312967 2573 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 16 16:47:55.313181 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.313149 2573 factory.go:55] Registering systemd factory Apr 16 16:47:55.313181 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.313163 2573 factory.go:223] Registration of the systemd container factory successfully Apr 16 16:47:55.313320 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:47:55.313299 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-58.ec2.internal\" not found" Apr 16 16:47:55.313504 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.313490 2573 factory.go:153] Registering CRI-O factory Apr 16 16:47:55.313568 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.313509 2573 factory.go:223] Registration of the crio container factory successfully Apr 16 16:47:55.313615 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.313014 2573 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 16 16:47:55.313615 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.313600 2573 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 16 
16:47:55.313615 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.313588 2573 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 16 16:47:55.313703 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.313640 2573 factory.go:103] Registering Raw factory Apr 16 16:47:55.313703 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.313663 2573 manager.go:1196] Started watching for new ooms in manager Apr 16 16:47:55.313703 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.313696 2573 reconstruct.go:97] "Volume reconstruction finished" Apr 16 16:47:55.313703 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.313702 2573 reconciler.go:26] "Reconciler: start to sync state" Apr 16 16:47:55.314034 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.314022 2573 manager.go:319] Starting recovery of all containers Apr 16 16:47:55.314310 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:47:55.314290 2573 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 16 16:47:55.317963 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:47:55.317932 2573 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-138-58.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 16 16:47:55.318260 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:47:55.318235 2573 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 16 16:47:55.318326 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.318294 2573 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-138-58.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 16 16:47:55.320138 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:47:55.318486 2573 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-138-58.ec2.internal.18a6e4420877726d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-138-58.ec2.internal,UID:ip-10-0-138-58.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-138-58.ec2.internal,},FirstTimestamp:2026-04-16 16:47:55.305259629 +0000 UTC 
m=+0.445978365,LastTimestamp:2026-04-16 16:47:55.305259629 +0000 UTC m=+0.445978365,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-138-58.ec2.internal,}" Apr 16 16:47:55.324486 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.324462 2573 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-m4ls8" Apr 16 16:47:55.327162 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.327144 2573 manager.go:324] Recovery completed Apr 16 16:47:55.330907 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.330891 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 16:47:55.332994 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.332976 2573 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-m4ls8" Apr 16 16:47:55.333133 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.333119 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-58.ec2.internal" event="NodeHasSufficientMemory" Apr 16 16:47:55.333171 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.333146 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-58.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 16:47:55.333171 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.333158 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-58.ec2.internal" event="NodeHasSufficientPID" Apr 16 16:47:55.333637 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.333624 2573 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 16 16:47:55.333637 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.333635 2573 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 16 16:47:55.333738 ip-10-0-138-58 kubenswrapper[2573]: I0416 
16:47:55.333665 2573 state_mem.go:36] "Initialized new in-memory state store" Apr 16 16:47:55.334804 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:47:55.334747 2573 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-138-58.ec2.internal.18a6e4420a20c2a3 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-138-58.ec2.internal,UID:ip-10-0-138-58.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-138-58.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-138-58.ec2.internal,},FirstTimestamp:2026-04-16 16:47:55.333132963 +0000 UTC m=+0.473851695,LastTimestamp:2026-04-16 16:47:55.333132963 +0000 UTC m=+0.473851695,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-138-58.ec2.internal,}" Apr 16 16:47:55.335995 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.335983 2573 policy_none.go:49] "None policy: Start" Apr 16 16:47:55.336029 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.335999 2573 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 16 16:47:55.336029 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.336009 2573 state_mem.go:35] "Initializing new in-memory state store" Apr 16 16:47:55.389474 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.375559 2573 manager.go:341] "Starting Device Plugin manager" Apr 16 16:47:55.389474 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:47:55.375607 2573 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 16 16:47:55.389474 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.375615 2573 server.go:85] 
"Starting device plugin registration server" Apr 16 16:47:55.389474 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.375825 2573 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 16 16:47:55.389474 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.375834 2573 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 16 16:47:55.389474 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.375913 2573 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 16 16:47:55.389474 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.375973 2573 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 16 16:47:55.389474 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.375981 2573 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 16 16:47:55.389474 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:47:55.376595 2573 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 16 16:47:55.389474 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:47:55.376631 2573 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-138-58.ec2.internal\" not found" Apr 16 16:47:55.414980 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.414961 2573 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 16 16:47:55.416121 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.416105 2573 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv6" Apr 16 16:47:55.416196 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.416129 2573 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 16 16:47:55.416196 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.416145 2573 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 16 16:47:55.416196 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.416153 2573 kubelet.go:2451] "Starting kubelet main sync loop" Apr 16 16:47:55.416324 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:47:55.416204 2573 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 16 16:47:55.419039 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.419017 2573 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 16:47:55.476493 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.476449 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 16:47:55.477471 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.477458 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-58.ec2.internal" event="NodeHasSufficientMemory" Apr 16 16:47:55.477540 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.477484 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-58.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 16:47:55.477540 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.477500 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-58.ec2.internal" event="NodeHasSufficientPID" Apr 16 16:47:55.477540 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.477536 2573 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-138-58.ec2.internal" Apr 16 16:47:55.486247 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.486235 2573 
kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-138-58.ec2.internal" Apr 16 16:47:55.486294 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:47:55.486253 2573 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-138-58.ec2.internal\": node \"ip-10-0-138-58.ec2.internal\" not found" Apr 16 16:47:55.510603 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:47:55.510583 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-58.ec2.internal\" not found" Apr 16 16:47:55.516984 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.516963 2573 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-58.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-138-58.ec2.internal"] Apr 16 16:47:55.517037 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.517018 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 16:47:55.517669 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.517655 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-58.ec2.internal" event="NodeHasSufficientMemory" Apr 16 16:47:55.517735 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.517683 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-58.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 16:47:55.517735 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.517695 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-58.ec2.internal" event="NodeHasSufficientPID" Apr 16 16:47:55.518840 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.518827 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 16:47:55.519001 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.518987 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-58.ec2.internal" Apr 16 16:47:55.519061 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.519020 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 16:47:55.519493 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.519478 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-58.ec2.internal" event="NodeHasSufficientMemory" Apr 16 16:47:55.519568 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.519483 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-58.ec2.internal" event="NodeHasSufficientMemory" Apr 16 16:47:55.519568 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.519504 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-58.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 16:47:55.519568 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.519514 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-58.ec2.internal" event="NodeHasSufficientPID" Apr 16 16:47:55.519568 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.519525 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-58.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 16:47:55.519568 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.519541 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-58.ec2.internal" event="NodeHasSufficientPID" Apr 16 16:47:55.521135 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.521120 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-58.ec2.internal" Apr 16 16:47:55.521213 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.521147 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 16:47:55.521759 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.521746 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-58.ec2.internal" event="NodeHasSufficientMemory" Apr 16 16:47:55.521828 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.521772 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-58.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 16:47:55.521828 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.521783 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-58.ec2.internal" event="NodeHasSufficientPID" Apr 16 16:47:55.539917 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:47:55.539900 2573 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-138-58.ec2.internal\" not found" node="ip-10-0-138-58.ec2.internal" Apr 16 16:47:55.543611 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:47:55.543597 2573 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-138-58.ec2.internal\" not found" node="ip-10-0-138-58.ec2.internal" Apr 16 16:47:55.611221 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:47:55.611201 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-58.ec2.internal\" not found" Apr 16 16:47:55.615602 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.615586 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/82016c9c370b8a67ea924de37e34c743-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-138-58.ec2.internal\" (UID: \"82016c9c370b8a67ea924de37e34c743\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-58.ec2.internal" Apr 16 16:47:55.615677 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.615610 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/82016c9c370b8a67ea924de37e34c743-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-58.ec2.internal\" (UID: \"82016c9c370b8a67ea924de37e34c743\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-58.ec2.internal" Apr 16 16:47:55.615677 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.615627 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/43c81a0a9a01b0c05b287312dc013cbc-config\") pod \"kube-apiserver-proxy-ip-10-0-138-58.ec2.internal\" (UID: \"43c81a0a9a01b0c05b287312dc013cbc\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-58.ec2.internal" Apr 16 16:47:55.712029 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:47:55.712000 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-58.ec2.internal\" not found" Apr 16 16:47:55.716291 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.716276 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/82016c9c370b8a67ea924de37e34c743-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-58.ec2.internal\" (UID: \"82016c9c370b8a67ea924de37e34c743\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-58.ec2.internal" Apr 16 16:47:55.716343 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.716301 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/82016c9c370b8a67ea924de37e34c743-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-58.ec2.internal\" (UID: \"82016c9c370b8a67ea924de37e34c743\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-58.ec2.internal" Apr 16 16:47:55.716343 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.716318 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/43c81a0a9a01b0c05b287312dc013cbc-config\") pod \"kube-apiserver-proxy-ip-10-0-138-58.ec2.internal\" (UID: \"43c81a0a9a01b0c05b287312dc013cbc\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-58.ec2.internal" Apr 16 16:47:55.716433 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.716365 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/82016c9c370b8a67ea924de37e34c743-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-58.ec2.internal\" (UID: \"82016c9c370b8a67ea924de37e34c743\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-58.ec2.internal" Apr 16 16:47:55.716433 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.716365 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/82016c9c370b8a67ea924de37e34c743-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-58.ec2.internal\" (UID: \"82016c9c370b8a67ea924de37e34c743\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-58.ec2.internal" Apr 16 16:47:55.716433 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.716410 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/43c81a0a9a01b0c05b287312dc013cbc-config\") pod \"kube-apiserver-proxy-ip-10-0-138-58.ec2.internal\" (UID: \"43c81a0a9a01b0c05b287312dc013cbc\") " 
pod="kube-system/kube-apiserver-proxy-ip-10-0-138-58.ec2.internal"
Apr 16 16:47:55.812724 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:47:55.812663 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-58.ec2.internal\" not found"
Apr 16 16:47:55.843082 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.843058 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-58.ec2.internal"
Apr 16 16:47:55.846255 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:55.846239 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-58.ec2.internal"
Apr 16 16:47:55.913314 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:47:55.913275 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-58.ec2.internal\" not found"
Apr 16 16:47:56.013745 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:47:56.013717 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-58.ec2.internal\" not found"
Apr 16 16:47:56.114141 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:47:56.114078 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-58.ec2.internal\" not found"
Apr 16 16:47:56.168446 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:56.168425 2573 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 16:47:56.214444 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:47:56.214415 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-58.ec2.internal\" not found"
Apr 16 16:47:56.217572 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:56.217559 2573 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 16 16:47:56.217677 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:56.217660 2573 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 16:47:56.217734 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:56.217711 2573 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 16:47:56.312941 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:56.312818 2573 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 16 16:47:56.314890 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:47:56.314869 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-58.ec2.internal\" not found"
Apr 16 16:47:56.333053 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:56.333033 2573 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 16:47:56.334980 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:56.334949 2573 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 16:42:55 +0000 UTC" deadline="2027-10-03 23:40:43.309757189 +0000 UTC"
Apr 16 16:47:56.335039 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:56.334981 2573 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12846h52m46.974779725s"
Apr 16 16:47:56.344040 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:56.344008 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod82016c9c370b8a67ea924de37e34c743.slice/crio-511a09cb9ab0d81456ccf65153169116fbab54537a04aa27152bafa5c961aa29 WatchSource:0}: Error finding container 511a09cb9ab0d81456ccf65153169116fbab54537a04aa27152bafa5c961aa29: Status 404 returned error can't find the container with id 511a09cb9ab0d81456ccf65153169116fbab54537a04aa27152bafa5c961aa29
Apr 16 16:47:56.344496 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:56.344470 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43c81a0a9a01b0c05b287312dc013cbc.slice/crio-cf0cae083abcf27cff0d27821d01687a6f78fdf8fc1d774583202a21ee7bebe0 WatchSource:0}: Error finding container cf0cae083abcf27cff0d27821d01687a6f78fdf8fc1d774583202a21ee7bebe0: Status 404 returned error can't find the container with id cf0cae083abcf27cff0d27821d01687a6f78fdf8fc1d774583202a21ee7bebe0
Apr 16 16:47:56.348396 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:56.348366 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 16:47:56.356285 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:56.356267 2573 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-plzld"
Apr 16 16:47:56.364434 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:56.364348 2573 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-plzld"
Apr 16 16:47:56.408364 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:56.408345 2573 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 16:47:56.415855 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:47:56.415836 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-58.ec2.internal\" not found"
Apr 16 16:47:56.418607 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:56.418563 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-58.ec2.internal" event={"ID":"43c81a0a9a01b0c05b287312dc013cbc","Type":"ContainerStarted","Data":"cf0cae083abcf27cff0d27821d01687a6f78fdf8fc1d774583202a21ee7bebe0"}
Apr 16 16:47:56.419504 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:56.419484 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-58.ec2.internal" event={"ID":"82016c9c370b8a67ea924de37e34c743","Type":"ContainerStarted","Data":"511a09cb9ab0d81456ccf65153169116fbab54537a04aa27152bafa5c961aa29"}
Apr 16 16:47:56.515900 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:47:56.515872 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-58.ec2.internal\" not found"
Apr 16 16:47:56.590868 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:56.590846 2573 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 16:47:56.613260 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:56.613234 2573 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-58.ec2.internal"
Apr 16 16:47:56.627401 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:56.627340 2573 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 16:47:56.628318 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:56.628305 2573 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-58.ec2.internal"
Apr 16 16:47:56.637928 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:56.637906 2573 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 16:47:57.276596 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.276569 2573 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 16:47:57.291227 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.291201 2573 apiserver.go:52] "Watching apiserver"
Apr 16 16:47:57.301203 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.301176 2573 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 16 16:47:57.302982 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.302956 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kjtps","openshift-cluster-node-tuning-operator/tuned-tg8h2","openshift-dns/node-resolver-c8bgf","openshift-image-registry/node-ca-49l6w","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-58.ec2.internal","openshift-multus/multus-77mp4","openshift-multus/multus-additional-cni-plugins-t7jrv","openshift-multus/network-metrics-daemon-x6gbd","openshift-network-diagnostics/network-check-target-md7k7","openshift-network-operator/iptables-alerter-8ghz4","openshift-ovn-kubernetes/ovnkube-node-vts2x","kube-system/konnectivity-agent-ms2xr","kube-system/kube-apiserver-proxy-ip-10-0-138-58.ec2.internal"]
Apr 16 16:47:57.304442 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.304420 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-t7jrv"
Apr 16 16:47:57.305477 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.305456 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-tg8h2"
Apr 16 16:47:57.306459 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.306441 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-c8bgf"
Apr 16 16:47:57.307209 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.307179 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 16 16:47:57.307209 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.307194 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-nxv7k\""
Apr 16 16:47:57.307209 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.307202 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 16 16:47:57.307681 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.307660 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-49l6w"
Apr 16 16:47:57.308328 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.308310 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 16 16:47:57.308328 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.308322 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 16 16:47:57.308910 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.308523 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-6d8vz\""
Apr 16 16:47:57.308910 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.308540 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 16 16:47:57.308910 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.308586 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 16 16:47:57.308910 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.308708 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 16 16:47:57.308910 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.308807 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 16 16:47:57.308910 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.308846 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 16 16:47:57.309249 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.309002 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-9m6xk\""
Apr 16 16:47:57.310005 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.309988 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 16 16:47:57.310098 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.310035 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 16 16:47:57.310156 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.310109 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kjtps"
Apr 16 16:47:57.311458 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.310942 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-6pxz4\""
Apr 16 16:47:57.311458 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.311011 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 16 16:47:57.313130 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.312894 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 16 16:47:57.313130 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.312927 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 16 16:47:57.313633 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.313410 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x6gbd"
Apr 16 16:47:57.313633 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.313437 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 16 16:47:57.313633 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.313531 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-77mp4"
Apr 16 16:47:57.313633 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:47:57.313526 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x6gbd" podUID="f7e26d85-638f-42c1-9b32-67320a5cbbe3"
Apr 16 16:47:57.314100 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.314082 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-56j9w\""
Apr 16 16:47:57.315082 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.315034 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-md7k7"
Apr 16 16:47:57.315186 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:47:57.315098 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-md7k7" podUID="41dd0be2-82c4-4469-b8d9-d1b98a4adb55"
Apr 16 16:47:57.316046 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.316028 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-m6m8m\""
Apr 16 16:47:57.316125 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.316057 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 16 16:47:57.316475 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.316459 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-8ghz4"
Apr 16 16:47:57.317865 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.317842 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-vts2x"
Apr 16 16:47:57.319015 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.318971 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-ms2xr"
Apr 16 16:47:57.319425 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.319407 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 16 16:47:57.319507 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.319445 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 16 16:47:57.319507 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.319469 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-kjxrk\""
Apr 16 16:47:57.319771 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.319753 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 16 16:47:57.320432 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.320414 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 16 16:47:57.320740 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.320722 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-2sb4n\""
Apr 16 16:47:57.321038 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.320986 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 16 16:47:57.321101 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.321052 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 16 16:47:57.321284 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.321163 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 16 16:47:57.321477 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.321457 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 16 16:47:57.321554 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.321496 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-cnlvd\""
Apr 16 16:47:57.321608 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.321552 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 16 16:47:57.321688 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.321670 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 16 16:47:57.321810 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.321793 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 16 16:47:57.324909 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.324887 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/496bd28d-40d9-43b2-91c6-462df146eecc-host-run-multus-certs\") pod \"multus-77mp4\" (UID: \"496bd28d-40d9-43b2-91c6-462df146eecc\") " pod="openshift-multus/multus-77mp4"
Apr 16 16:47:57.325017 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.324919 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jpfm4\" (UniqueName: \"kubernetes.io/projected/e050081a-408e-4bfd-b874-def2bfd7f635-kube-api-access-jpfm4\") pod \"tuned-tg8h2\" (UID: \"e050081a-408e-4bfd-b874-def2bfd7f635\") " pod="openshift-cluster-node-tuning-operator/tuned-tg8h2"
Apr 16 16:47:57.325017 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.324944 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/496bd28d-40d9-43b2-91c6-462df146eecc-system-cni-dir\") pod \"multus-77mp4\" (UID: \"496bd28d-40d9-43b2-91c6-462df146eecc\") " pod="openshift-multus/multus-77mp4"
Apr 16 16:47:57.325017 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.324969 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/496bd28d-40d9-43b2-91c6-462df146eecc-host-var-lib-cni-multus\") pod \"multus-77mp4\" (UID: \"496bd28d-40d9-43b2-91c6-462df146eecc\") " pod="openshift-multus/multus-77mp4"
Apr 16 16:47:57.325017 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.324992 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/496bd28d-40d9-43b2-91c6-462df146eecc-hostroot\") pod \"multus-77mp4\" (UID: \"496bd28d-40d9-43b2-91c6-462df146eecc\") " pod="openshift-multus/multus-77mp4"
Apr 16 16:47:57.325017 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.325014 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/a3e5fc61-deb6-4697-bcc8-92eee1a15876-etc-selinux\") pod \"aws-ebs-csi-driver-node-kjtps\" (UID: \"a3e5fc61-deb6-4697-bcc8-92eee1a15876\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kjtps"
Apr 16 16:47:57.325251 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.325036 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/38487532-b4be-41ef-a345-cea3cc5a643c-hosts-file\") pod \"node-resolver-c8bgf\" (UID: \"38487532-b4be-41ef-a345-cea3cc5a643c\") " pod="openshift-dns/node-resolver-c8bgf"
Apr 16 16:47:57.325251 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.325060 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/50367a6c-7164-45c2-b2f1-af3375aa5768-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-t7jrv\" (UID: \"50367a6c-7164-45c2-b2f1-af3375aa5768\") " pod="openshift-multus/multus-additional-cni-plugins-t7jrv"
Apr 16 16:47:57.325251 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.325085 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e050081a-408e-4bfd-b874-def2bfd7f635-sys\") pod \"tuned-tg8h2\" (UID: \"e050081a-408e-4bfd-b874-def2bfd7f635\") " pod="openshift-cluster-node-tuning-operator/tuned-tg8h2"
Apr 16 16:47:57.325251 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.325108 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e050081a-408e-4bfd-b874-def2bfd7f635-lib-modules\") pod \"tuned-tg8h2\" (UID: \"e050081a-408e-4bfd-b874-def2bfd7f635\") " pod="openshift-cluster-node-tuning-operator/tuned-tg8h2"
Apr 16 16:47:57.325251 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.325126 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/496bd28d-40d9-43b2-91c6-462df146eecc-host-var-lib-cni-bin\") pod \"multus-77mp4\" (UID: \"496bd28d-40d9-43b2-91c6-462df146eecc\") " pod="openshift-multus/multus-77mp4"
Apr 16 16:47:57.325251 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.325148 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/a3e5fc61-deb6-4697-bcc8-92eee1a15876-device-dir\") pod \"aws-ebs-csi-driver-node-kjtps\" (UID: \"a3e5fc61-deb6-4697-bcc8-92eee1a15876\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kjtps"
Apr 16 16:47:57.325251 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.325186 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/a3e5fc61-deb6-4697-bcc8-92eee1a15876-sys-fs\") pod \"aws-ebs-csi-driver-node-kjtps\" (UID: \"a3e5fc61-deb6-4697-bcc8-92eee1a15876\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kjtps"
Apr 16 16:47:57.325251 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.325231 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a3cab79a-4ea1-4744-b205-fd85c929391f-host\") pod \"node-ca-49l6w\" (UID: \"a3cab79a-4ea1-4744-b205-fd85c929391f\") " pod="openshift-image-registry/node-ca-49l6w"
Apr 16 16:47:57.325644 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.325281 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4f8sc\" (UniqueName: \"kubernetes.io/projected/a3cab79a-4ea1-4744-b205-fd85c929391f-kube-api-access-4f8sc\") pod \"node-ca-49l6w\" (UID: \"a3cab79a-4ea1-4744-b205-fd85c929391f\") " pod="openshift-image-registry/node-ca-49l6w"
Apr 16 16:47:57.325644 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.325325 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/496bd28d-40d9-43b2-91c6-462df146eecc-cnibin\") pod \"multus-77mp4\" (UID: \"496bd28d-40d9-43b2-91c6-462df146eecc\") " pod="openshift-multus/multus-77mp4"
Apr 16 16:47:57.325644 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.325351 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/496bd28d-40d9-43b2-91c6-462df146eecc-cni-binary-copy\") pod \"multus-77mp4\" (UID: \"496bd28d-40d9-43b2-91c6-462df146eecc\") " pod="openshift-multus/multus-77mp4"
Apr 16 16:47:57.325644 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.325391 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dd66d\" (UniqueName: \"kubernetes.io/projected/496bd28d-40d9-43b2-91c6-462df146eecc-kube-api-access-dd66d\") pod \"multus-77mp4\" (UID: \"496bd28d-40d9-43b2-91c6-462df146eecc\") " pod="openshift-multus/multus-77mp4"
Apr 16 16:47:57.325644 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.325417 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzdj8\" (UniqueName: \"kubernetes.io/projected/a3e5fc61-deb6-4697-bcc8-92eee1a15876-kube-api-access-gzdj8\") pod \"aws-ebs-csi-driver-node-kjtps\" (UID: \"a3e5fc61-deb6-4697-bcc8-92eee1a15876\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kjtps"
Apr 16 16:47:57.325644 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.325442 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/e050081a-408e-4bfd-b874-def2bfd7f635-etc-sysctl-conf\") pod \"tuned-tg8h2\" (UID: \"e050081a-408e-4bfd-b874-def2bfd7f635\") " pod="openshift-cluster-node-tuning-operator/tuned-tg8h2"
Apr 16 16:47:57.325644 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.325465 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/496bd28d-40d9-43b2-91c6-462df146eecc-os-release\") pod \"multus-77mp4\" (UID: \"496bd28d-40d9-43b2-91c6-462df146eecc\") " pod="openshift-multus/multus-77mp4"
Apr 16 16:47:57.325644 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.325489 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/e050081a-408e-4bfd-b874-def2bfd7f635-etc-tuned\") pod \"tuned-tg8h2\" (UID: \"e050081a-408e-4bfd-b874-def2bfd7f635\") " pod="openshift-cluster-node-tuning-operator/tuned-tg8h2"
Apr 16 16:47:57.325644 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.325580 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a3e5fc61-deb6-4697-bcc8-92eee1a15876-kubelet-dir\") pod \"aws-ebs-csi-driver-node-kjtps\" (UID: \"a3e5fc61-deb6-4697-bcc8-92eee1a15876\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kjtps"
Apr 16 16:47:57.325644 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.325626 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/e050081a-408e-4bfd-b874-def2bfd7f635-etc-sysconfig\") pod \"tuned-tg8h2\" (UID: \"e050081a-408e-4bfd-b874-def2bfd7f635\") " pod="openshift-cluster-node-tuning-operator/tuned-tg8h2"
Apr 16 16:47:57.325644 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.325647 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e050081a-408e-4bfd-b874-def2bfd7f635-etc-kubernetes\") pod \"tuned-tg8h2\" (UID: \"e050081a-408e-4bfd-b874-def2bfd7f635\") " pod="openshift-cluster-node-tuning-operator/tuned-tg8h2"
Apr 16 16:47:57.326077 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.325668 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/496bd28d-40d9-43b2-91c6-462df146eecc-multus-cni-dir\") pod \"multus-77mp4\" (UID: \"496bd28d-40d9-43b2-91c6-462df146eecc\") " pod="openshift-multus/multus-77mp4"
Apr 16 16:47:57.326077 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.325689 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/496bd28d-40d9-43b2-91c6-462df146eecc-host-var-lib-kubelet\") pod \"multus-77mp4\" (UID: \"496bd28d-40d9-43b2-91c6-462df146eecc\") " pod="openshift-multus/multus-77mp4"
Apr 16 16:47:57.326077 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.325720 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/50367a6c-7164-45c2-b2f1-af3375aa5768-system-cni-dir\") pod \"multus-additional-cni-plugins-t7jrv\" (UID: \"50367a6c-7164-45c2-b2f1-af3375aa5768\") " pod="openshift-multus/multus-additional-cni-plugins-t7jrv"
Apr 16 16:47:57.326077 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.325742 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/50367a6c-7164-45c2-b2f1-af3375aa5768-tuning-conf-dir\") pod \"multus-additional-cni-plugins-t7jrv\" (UID: \"50367a6c-7164-45c2-b2f1-af3375aa5768\") " pod="openshift-multus/multus-additional-cni-plugins-t7jrv"
Apr 16 16:47:57.326077 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.325765 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/50367a6c-7164-45c2-b2f1-af3375aa5768-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-t7jrv\" (UID: \"50367a6c-7164-45c2-b2f1-af3375aa5768\") " pod="openshift-multus/multus-additional-cni-plugins-t7jrv"
Apr 16 16:47:57.326077 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.325787 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e050081a-408e-4bfd-b874-def2bfd7f635-var-lib-kubelet\") pod \"tuned-tg8h2\" (UID: \"e050081a-408e-4bfd-b874-def2bfd7f635\") " pod="openshift-cluster-node-tuning-operator/tuned-tg8h2"
Apr 16 16:47:57.326077 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.325826 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a3e5fc61-deb6-4697-bcc8-92eee1a15876-socket-dir\") pod \"aws-ebs-csi-driver-node-kjtps\" (UID: \"a3e5fc61-deb6-4697-bcc8-92eee1a15876\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kjtps"
Apr 16 16:47:57.326077 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.325851 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/50367a6c-7164-45c2-b2f1-af3375aa5768-cnibin\") pod \"multus-additional-cni-plugins-t7jrv\" (UID: \"50367a6c-7164-45c2-b2f1-af3375aa5768\") " pod="openshift-multus/multus-additional-cni-plugins-t7jrv"
Apr 16 16:47:57.326077 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.325902 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/50367a6c-7164-45c2-b2f1-af3375aa5768-cni-binary-copy\") pod \"multus-additional-cni-plugins-t7jrv\" (UID: \"50367a6c-7164-45c2-b2f1-af3375aa5768\") " pod="openshift-multus/multus-additional-cni-plugins-t7jrv"
Apr 16 16:47:57.326077 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.325935 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2k928\" (UniqueName: \"kubernetes.io/projected/50367a6c-7164-45c2-b2f1-af3375aa5768-kube-api-access-2k928\") pod \"multus-additional-cni-plugins-t7jrv\" (UID: \"50367a6c-7164-45c2-b2f1-af3375aa5768\") " pod="openshift-multus/multus-additional-cni-plugins-t7jrv"
Apr 16 16:47:57.326077 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.325962 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f7e26d85-638f-42c1-9b32-67320a5cbbe3-metrics-certs\") pod \"network-metrics-daemon-x6gbd\" (UID: \"f7e26d85-638f-42c1-9b32-67320a5cbbe3\") " pod="openshift-multus/network-metrics-daemon-x6gbd"
Apr 16 16:47:57.326077 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.326002 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e050081a-408e-4bfd-b874-def2bfd7f635-run\") pod \"tuned-tg8h2\" (UID: \"e050081a-408e-4bfd-b874-def2bfd7f635\") " pod="openshift-cluster-node-tuning-operator/tuned-tg8h2"
Apr 16 16:47:57.326077 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.326027 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/a3cab79a-4ea1-4744-b205-fd85c929391f-serviceca\") pod \"node-ca-49l6w\" (UID: \"a3cab79a-4ea1-4744-b205-fd85c929391f\") " pod="openshift-image-registry/node-ca-49l6w"
Apr 16 16:47:57.326077 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.326051 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/496bd28d-40d9-43b2-91c6-462df146eecc-multus-socket-dir-parent\") pod \"multus-77mp4\" (UID: \"496bd28d-40d9-43b2-91c6-462df146eecc\") " pod="openshift-multus/multus-77mp4"
Apr 16 16:47:57.326077 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.326087 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a3e5fc61-deb6-4697-bcc8-92eee1a15876-registration-dir\") pod \"aws-ebs-csi-driver-node-kjtps\" (UID: \"a3e5fc61-deb6-4697-bcc8-92eee1a15876\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kjtps"
Apr 16 16:47:57.326855 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.326116 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svwk7\" (UniqueName: \"kubernetes.io/projected/38487532-b4be-41ef-a345-cea3cc5a643c-kube-api-access-svwk7\") pod \"node-resolver-c8bgf\" (UID: \"38487532-b4be-41ef-a345-cea3cc5a643c\") " pod="openshift-dns/node-resolver-c8bgf"
Apr 16 16:47:57.326855 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.326139 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2mj2\" (UniqueName: \"kubernetes.io/projected/f7e26d85-638f-42c1-9b32-67320a5cbbe3-kube-api-access-g2mj2\") pod \"network-metrics-daemon-x6gbd\" (UID: \"f7e26d85-638f-42c1-9b32-67320a5cbbe3\") " pod="openshift-multus/network-metrics-daemon-x6gbd"
Apr 16 16:47:57.326855 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.326166 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/496bd28d-40d9-43b2-91c6-462df146eecc-etc-kubernetes\") pod \"multus-77mp4\" (UID: \"496bd28d-40d9-43b2-91c6-462df146eecc\") " pod="openshift-multus/multus-77mp4"
Apr 16 16:47:57.326855 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.326189 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/e050081a-408e-4bfd-b874-def2bfd7f635-etc-modprobe-d\") pod \"tuned-tg8h2\" (UID: \"e050081a-408e-4bfd-b874-def2bfd7f635\") " pod="openshift-cluster-node-tuning-operator/tuned-tg8h2"
Apr 16 16:47:57.326855 ip-10-0-138-58 kubenswrapper[2573]: I0416
16:47:57.326209 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/496bd28d-40d9-43b2-91c6-462df146eecc-host-run-netns\") pod \"multus-77mp4\" (UID: \"496bd28d-40d9-43b2-91c6-462df146eecc\") " pod="openshift-multus/multus-77mp4" Apr 16 16:47:57.326855 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.326232 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/e050081a-408e-4bfd-b874-def2bfd7f635-etc-systemd\") pod \"tuned-tg8h2\" (UID: \"e050081a-408e-4bfd-b874-def2bfd7f635\") " pod="openshift-cluster-node-tuning-operator/tuned-tg8h2" Apr 16 16:47:57.326855 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.326254 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e050081a-408e-4bfd-b874-def2bfd7f635-host\") pod \"tuned-tg8h2\" (UID: \"e050081a-408e-4bfd-b874-def2bfd7f635\") " pod="openshift-cluster-node-tuning-operator/tuned-tg8h2" Apr 16 16:47:57.326855 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.326273 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/496bd28d-40d9-43b2-91c6-462df146eecc-host-run-k8s-cni-cncf-io\") pod \"multus-77mp4\" (UID: \"496bd28d-40d9-43b2-91c6-462df146eecc\") " pod="openshift-multus/multus-77mp4" Apr 16 16:47:57.326855 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.326287 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/496bd28d-40d9-43b2-91c6-462df146eecc-multus-daemon-config\") pod \"multus-77mp4\" (UID: \"496bd28d-40d9-43b2-91c6-462df146eecc\") " pod="openshift-multus/multus-77mp4" Apr 16 
16:47:57.326855 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.326302 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/50367a6c-7164-45c2-b2f1-af3375aa5768-os-release\") pod \"multus-additional-cni-plugins-t7jrv\" (UID: \"50367a6c-7164-45c2-b2f1-af3375aa5768\") " pod="openshift-multus/multus-additional-cni-plugins-t7jrv" Apr 16 16:47:57.326855 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.326328 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/38487532-b4be-41ef-a345-cea3cc5a643c-tmp-dir\") pod \"node-resolver-c8bgf\" (UID: \"38487532-b4be-41ef-a345-cea3cc5a643c\") " pod="openshift-dns/node-resolver-c8bgf" Apr 16 16:47:57.326855 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.326343 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/e050081a-408e-4bfd-b874-def2bfd7f635-etc-sysctl-d\") pod \"tuned-tg8h2\" (UID: \"e050081a-408e-4bfd-b874-def2bfd7f635\") " pod="openshift-cluster-node-tuning-operator/tuned-tg8h2" Apr 16 16:47:57.326855 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.326355 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e050081a-408e-4bfd-b874-def2bfd7f635-tmp\") pod \"tuned-tg8h2\" (UID: \"e050081a-408e-4bfd-b874-def2bfd7f635\") " pod="openshift-cluster-node-tuning-operator/tuned-tg8h2" Apr 16 16:47:57.326855 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.326369 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/496bd28d-40d9-43b2-91c6-462df146eecc-multus-conf-dir\") pod \"multus-77mp4\" (UID: 
\"496bd28d-40d9-43b2-91c6-462df146eecc\") " pod="openshift-multus/multus-77mp4" Apr 16 16:47:57.364957 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.364926 2573 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 16:42:56 +0000 UTC" deadline="2028-01-29 10:18:05.671395198 +0000 UTC" Apr 16 16:47:57.364957 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.364956 2573 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15665h30m8.306442684s" Apr 16 16:47:57.414319 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.414298 2573 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 16 16:47:57.426558 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.426517 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/50367a6c-7164-45c2-b2f1-af3375aa5768-tuning-conf-dir\") pod \"multus-additional-cni-plugins-t7jrv\" (UID: \"50367a6c-7164-45c2-b2f1-af3375aa5768\") " pod="openshift-multus/multus-additional-cni-plugins-t7jrv" Apr 16 16:47:57.426681 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.426562 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7dsv\" (UniqueName: \"kubernetes.io/projected/41dd0be2-82c4-4469-b8d9-d1b98a4adb55-kube-api-access-f7dsv\") pod \"network-check-target-md7k7\" (UID: \"41dd0be2-82c4-4469-b8d9-d1b98a4adb55\") " pod="openshift-network-diagnostics/network-check-target-md7k7" Apr 16 16:47:57.426681 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.426599 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cc2084a8-5ef5-4dde-bae7-f84589b59b40-run-openvswitch\") pod \"ovnkube-node-vts2x\" (UID: 
\"cc2084a8-5ef5-4dde-bae7-f84589b59b40\") " pod="openshift-ovn-kubernetes/ovnkube-node-vts2x" Apr 16 16:47:57.426681 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.426627 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsl6f\" (UniqueName: \"kubernetes.io/projected/06481c75-0539-4928-baa6-a9ed683f7054-kube-api-access-fsl6f\") pod \"iptables-alerter-8ghz4\" (UID: \"06481c75-0539-4928-baa6-a9ed683f7054\") " pod="openshift-network-operator/iptables-alerter-8ghz4" Apr 16 16:47:57.426681 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.426656 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/50367a6c-7164-45c2-b2f1-af3375aa5768-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-t7jrv\" (UID: \"50367a6c-7164-45c2-b2f1-af3375aa5768\") " pod="openshift-multus/multus-additional-cni-plugins-t7jrv" Apr 16 16:47:57.426851 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.426685 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cc2084a8-5ef5-4dde-bae7-f84589b59b40-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vts2x\" (UID: \"cc2084a8-5ef5-4dde-bae7-f84589b59b40\") " pod="openshift-ovn-kubernetes/ovnkube-node-vts2x" Apr 16 16:47:57.426851 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.426717 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e050081a-408e-4bfd-b874-def2bfd7f635-var-lib-kubelet\") pod \"tuned-tg8h2\" (UID: \"e050081a-408e-4bfd-b874-def2bfd7f635\") " pod="openshift-cluster-node-tuning-operator/tuned-tg8h2" Apr 16 16:47:57.426851 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.426743 2573 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a3e5fc61-deb6-4697-bcc8-92eee1a15876-socket-dir\") pod \"aws-ebs-csi-driver-node-kjtps\" (UID: \"a3e5fc61-deb6-4697-bcc8-92eee1a15876\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kjtps" Apr 16 16:47:57.426851 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.426769 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/50367a6c-7164-45c2-b2f1-af3375aa5768-cnibin\") pod \"multus-additional-cni-plugins-t7jrv\" (UID: \"50367a6c-7164-45c2-b2f1-af3375aa5768\") " pod="openshift-multus/multus-additional-cni-plugins-t7jrv" Apr 16 16:47:57.426851 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.426793 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/50367a6c-7164-45c2-b2f1-af3375aa5768-cni-binary-copy\") pod \"multus-additional-cni-plugins-t7jrv\" (UID: \"50367a6c-7164-45c2-b2f1-af3375aa5768\") " pod="openshift-multus/multus-additional-cni-plugins-t7jrv" Apr 16 16:47:57.426851 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.426819 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2k928\" (UniqueName: \"kubernetes.io/projected/50367a6c-7164-45c2-b2f1-af3375aa5768-kube-api-access-2k928\") pod \"multus-additional-cni-plugins-t7jrv\" (UID: \"50367a6c-7164-45c2-b2f1-af3375aa5768\") " pod="openshift-multus/multus-additional-cni-plugins-t7jrv" Apr 16 16:47:57.426851 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.426827 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e050081a-408e-4bfd-b874-def2bfd7f635-var-lib-kubelet\") pod \"tuned-tg8h2\" (UID: \"e050081a-408e-4bfd-b874-def2bfd7f635\") " pod="openshift-cluster-node-tuning-operator/tuned-tg8h2" Apr 16 16:47:57.426851 ip-10-0-138-58 
kubenswrapper[2573]: I0416 16:47:57.426844 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f7e26d85-638f-42c1-9b32-67320a5cbbe3-metrics-certs\") pod \"network-metrics-daemon-x6gbd\" (UID: \"f7e26d85-638f-42c1-9b32-67320a5cbbe3\") " pod="openshift-multus/network-metrics-daemon-x6gbd" Apr 16 16:47:57.427098 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.426871 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e050081a-408e-4bfd-b874-def2bfd7f635-run\") pod \"tuned-tg8h2\" (UID: \"e050081a-408e-4bfd-b874-def2bfd7f635\") " pod="openshift-cluster-node-tuning-operator/tuned-tg8h2" Apr 16 16:47:57.427098 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.426901 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/1950f42e-3894-4436-a1c6-d5e65379ba61-konnectivity-ca\") pod \"konnectivity-agent-ms2xr\" (UID: \"1950f42e-3894-4436-a1c6-d5e65379ba61\") " pod="kube-system/konnectivity-agent-ms2xr" Apr 16 16:47:57.427098 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.426926 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/a3cab79a-4ea1-4744-b205-fd85c929391f-serviceca\") pod \"node-ca-49l6w\" (UID: \"a3cab79a-4ea1-4744-b205-fd85c929391f\") " pod="openshift-image-registry/node-ca-49l6w" Apr 16 16:47:57.427098 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.426952 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/496bd28d-40d9-43b2-91c6-462df146eecc-multus-socket-dir-parent\") pod \"multus-77mp4\" (UID: \"496bd28d-40d9-43b2-91c6-462df146eecc\") " pod="openshift-multus/multus-77mp4" Apr 16 16:47:57.427098 ip-10-0-138-58 
kubenswrapper[2573]: I0416 16:47:57.426978 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a3e5fc61-deb6-4697-bcc8-92eee1a15876-registration-dir\") pod \"aws-ebs-csi-driver-node-kjtps\" (UID: \"a3e5fc61-deb6-4697-bcc8-92eee1a15876\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kjtps" Apr 16 16:47:57.427098 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.427003 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-svwk7\" (UniqueName: \"kubernetes.io/projected/38487532-b4be-41ef-a345-cea3cc5a643c-kube-api-access-svwk7\") pod \"node-resolver-c8bgf\" (UID: \"38487532-b4be-41ef-a345-cea3cc5a643c\") " pod="openshift-dns/node-resolver-c8bgf" Apr 16 16:47:57.427098 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.427002 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a3e5fc61-deb6-4697-bcc8-92eee1a15876-socket-dir\") pod \"aws-ebs-csi-driver-node-kjtps\" (UID: \"a3e5fc61-deb6-4697-bcc8-92eee1a15876\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kjtps" Apr 16 16:47:57.427098 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.427030 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g2mj2\" (UniqueName: \"kubernetes.io/projected/f7e26d85-638f-42c1-9b32-67320a5cbbe3-kube-api-access-g2mj2\") pod \"network-metrics-daemon-x6gbd\" (UID: \"f7e26d85-638f-42c1-9b32-67320a5cbbe3\") " pod="openshift-multus/network-metrics-daemon-x6gbd" Apr 16 16:47:57.427098 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.427060 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/cc2084a8-5ef5-4dde-bae7-f84589b59b40-run-ovn\") pod \"ovnkube-node-vts2x\" (UID: \"cc2084a8-5ef5-4dde-bae7-f84589b59b40\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-vts2x" Apr 16 16:47:57.427098 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.427065 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e050081a-408e-4bfd-b874-def2bfd7f635-run\") pod \"tuned-tg8h2\" (UID: \"e050081a-408e-4bfd-b874-def2bfd7f635\") " pod="openshift-cluster-node-tuning-operator/tuned-tg8h2" Apr 16 16:47:57.427098 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.427065 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/50367a6c-7164-45c2-b2f1-af3375aa5768-cnibin\") pod \"multus-additional-cni-plugins-t7jrv\" (UID: \"50367a6c-7164-45c2-b2f1-af3375aa5768\") " pod="openshift-multus/multus-additional-cni-plugins-t7jrv" Apr 16 16:47:57.427098 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.427084 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/cc2084a8-5ef5-4dde-bae7-f84589b59b40-ovn-node-metrics-cert\") pod \"ovnkube-node-vts2x\" (UID: \"cc2084a8-5ef5-4dde-bae7-f84589b59b40\") " pod="openshift-ovn-kubernetes/ovnkube-node-vts2x" Apr 16 16:47:57.427699 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.427109 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/496bd28d-40d9-43b2-91c6-462df146eecc-etc-kubernetes\") pod \"multus-77mp4\" (UID: \"496bd28d-40d9-43b2-91c6-462df146eecc\") " pod="openshift-multus/multus-77mp4" Apr 16 16:47:57.427699 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.427111 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a3e5fc61-deb6-4697-bcc8-92eee1a15876-registration-dir\") pod \"aws-ebs-csi-driver-node-kjtps\" (UID: \"a3e5fc61-deb6-4697-bcc8-92eee1a15876\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kjtps" Apr 16 16:47:57.427699 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.427135 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cc2084a8-5ef5-4dde-bae7-f84589b59b40-host-cni-bin\") pod \"ovnkube-node-vts2x\" (UID: \"cc2084a8-5ef5-4dde-bae7-f84589b59b40\") " pod="openshift-ovn-kubernetes/ovnkube-node-vts2x" Apr 16 16:47:57.427699 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.427161 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/cc2084a8-5ef5-4dde-bae7-f84589b59b40-env-overrides\") pod \"ovnkube-node-vts2x\" (UID: \"cc2084a8-5ef5-4dde-bae7-f84589b59b40\") " pod="openshift-ovn-kubernetes/ovnkube-node-vts2x" Apr 16 16:47:57.427699 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.427189 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lsd2c\" (UniqueName: \"kubernetes.io/projected/cc2084a8-5ef5-4dde-bae7-f84589b59b40-kube-api-access-lsd2c\") pod \"ovnkube-node-vts2x\" (UID: \"cc2084a8-5ef5-4dde-bae7-f84589b59b40\") " pod="openshift-ovn-kubernetes/ovnkube-node-vts2x" Apr 16 16:47:57.427699 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.427213 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/50367a6c-7164-45c2-b2f1-af3375aa5768-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-t7jrv\" (UID: \"50367a6c-7164-45c2-b2f1-af3375aa5768\") " pod="openshift-multus/multus-additional-cni-plugins-t7jrv" Apr 16 16:47:57.427699 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.427227 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: 
\"kubernetes.io/host-path/e050081a-408e-4bfd-b874-def2bfd7f635-etc-modprobe-d\") pod \"tuned-tg8h2\" (UID: \"e050081a-408e-4bfd-b874-def2bfd7f635\") " pod="openshift-cluster-node-tuning-operator/tuned-tg8h2" Apr 16 16:47:57.427699 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.427257 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/496bd28d-40d9-43b2-91c6-462df146eecc-host-run-netns\") pod \"multus-77mp4\" (UID: \"496bd28d-40d9-43b2-91c6-462df146eecc\") " pod="openshift-multus/multus-77mp4" Apr 16 16:47:57.427699 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.427281 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/e050081a-408e-4bfd-b874-def2bfd7f635-etc-systemd\") pod \"tuned-tg8h2\" (UID: \"e050081a-408e-4bfd-b874-def2bfd7f635\") " pod="openshift-cluster-node-tuning-operator/tuned-tg8h2" Apr 16 16:47:57.427699 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.427302 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/50367a6c-7164-45c2-b2f1-af3375aa5768-tuning-conf-dir\") pod \"multus-additional-cni-plugins-t7jrv\" (UID: \"50367a6c-7164-45c2-b2f1-af3375aa5768\") " pod="openshift-multus/multus-additional-cni-plugins-t7jrv" Apr 16 16:47:57.427699 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.427307 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e050081a-408e-4bfd-b874-def2bfd7f635-host\") pod \"tuned-tg8h2\" (UID: \"e050081a-408e-4bfd-b874-def2bfd7f635\") " pod="openshift-cluster-node-tuning-operator/tuned-tg8h2" Apr 16 16:47:57.427699 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.427371 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/496bd28d-40d9-43b2-91c6-462df146eecc-host-run-netns\") pod \"multus-77mp4\" (UID: \"496bd28d-40d9-43b2-91c6-462df146eecc\") " pod="openshift-multus/multus-77mp4" Apr 16 16:47:57.427699 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.427395 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/496bd28d-40d9-43b2-91c6-462df146eecc-etc-kubernetes\") pod \"multus-77mp4\" (UID: \"496bd28d-40d9-43b2-91c6-462df146eecc\") " pod="openshift-multus/multus-77mp4" Apr 16 16:47:57.427699 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.427417 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e050081a-408e-4bfd-b874-def2bfd7f635-host\") pod \"tuned-tg8h2\" (UID: \"e050081a-408e-4bfd-b874-def2bfd7f635\") " pod="openshift-cluster-node-tuning-operator/tuned-tg8h2" Apr 16 16:47:57.427699 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.427444 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/496bd28d-40d9-43b2-91c6-462df146eecc-multus-socket-dir-parent\") pod \"multus-77mp4\" (UID: \"496bd28d-40d9-43b2-91c6-462df146eecc\") " pod="openshift-multus/multus-77mp4" Apr 16 16:47:57.427699 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.427465 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/496bd28d-40d9-43b2-91c6-462df146eecc-host-run-k8s-cni-cncf-io\") pod \"multus-77mp4\" (UID: \"496bd28d-40d9-43b2-91c6-462df146eecc\") " pod="openshift-multus/multus-77mp4" Apr 16 16:47:57.427699 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.427446 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/e050081a-408e-4bfd-b874-def2bfd7f635-etc-systemd\") pod 
\"tuned-tg8h2\" (UID: \"e050081a-408e-4bfd-b874-def2bfd7f635\") " pod="openshift-cluster-node-tuning-operator/tuned-tg8h2" Apr 16 16:47:57.427699 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:47:57.427499 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:47:57.428318 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.427501 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/496bd28d-40d9-43b2-91c6-462df146eecc-multus-daemon-config\") pod \"multus-77mp4\" (UID: \"496bd28d-40d9-43b2-91c6-462df146eecc\") " pod="openshift-multus/multus-77mp4" Apr 16 16:47:57.428318 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.427519 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/a3cab79a-4ea1-4744-b205-fd85c929391f-serviceca\") pod \"node-ca-49l6w\" (UID: \"a3cab79a-4ea1-4744-b205-fd85c929391f\") " pod="openshift-image-registry/node-ca-49l6w" Apr 16 16:47:57.428318 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.427530 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/496bd28d-40d9-43b2-91c6-462df146eecc-host-run-k8s-cni-cncf-io\") pod \"multus-77mp4\" (UID: \"496bd28d-40d9-43b2-91c6-462df146eecc\") " pod="openshift-multus/multus-77mp4" Apr 16 16:47:57.428318 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.427558 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/e050081a-408e-4bfd-b874-def2bfd7f635-etc-modprobe-d\") pod \"tuned-tg8h2\" (UID: \"e050081a-408e-4bfd-b874-def2bfd7f635\") " pod="openshift-cluster-node-tuning-operator/tuned-tg8h2" Apr 16 16:47:57.428318 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.427572 2573 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/50367a6c-7164-45c2-b2f1-af3375aa5768-os-release\") pod \"multus-additional-cni-plugins-t7jrv\" (UID: \"50367a6c-7164-45c2-b2f1-af3375aa5768\") " pod="openshift-multus/multus-additional-cni-plugins-t7jrv" Apr 16 16:47:57.428318 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:47:57.427621 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f7e26d85-638f-42c1-9b32-67320a5cbbe3-metrics-certs podName:f7e26d85-638f-42c1-9b32-67320a5cbbe3 nodeName:}" failed. No retries permitted until 2026-04-16 16:47:57.927605969 +0000 UTC m=+3.068324688 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f7e26d85-638f-42c1-9b32-67320a5cbbe3-metrics-certs") pod "network-metrics-daemon-x6gbd" (UID: "f7e26d85-638f-42c1-9b32-67320a5cbbe3") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:47:57.428318 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.427621 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/50367a6c-7164-45c2-b2f1-af3375aa5768-cni-binary-copy\") pod \"multus-additional-cni-plugins-t7jrv\" (UID: \"50367a6c-7164-45c2-b2f1-af3375aa5768\") " pod="openshift-multus/multus-additional-cni-plugins-t7jrv" Apr 16 16:47:57.428318 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.427659 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/38487532-b4be-41ef-a345-cea3cc5a643c-tmp-dir\") pod \"node-resolver-c8bgf\" (UID: \"38487532-b4be-41ef-a345-cea3cc5a643c\") " pod="openshift-dns/node-resolver-c8bgf" Apr 16 16:47:57.428318 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.427690 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cc2084a8-5ef5-4dde-bae7-f84589b59b40-host-run-ovn-kubernetes\") pod \"ovnkube-node-vts2x\" (UID: \"cc2084a8-5ef5-4dde-bae7-f84589b59b40\") " pod="openshift-ovn-kubernetes/ovnkube-node-vts2x" Apr 16 16:47:57.428318 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.427741 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/50367a6c-7164-45c2-b2f1-af3375aa5768-os-release\") pod \"multus-additional-cni-plugins-t7jrv\" (UID: \"50367a6c-7164-45c2-b2f1-af3375aa5768\") " pod="openshift-multus/multus-additional-cni-plugins-t7jrv" Apr 16 16:47:57.428318 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.427769 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/cc2084a8-5ef5-4dde-bae7-f84589b59b40-host-cni-netd\") pod \"ovnkube-node-vts2x\" (UID: \"cc2084a8-5ef5-4dde-bae7-f84589b59b40\") " pod="openshift-ovn-kubernetes/ovnkube-node-vts2x" Apr 16 16:47:57.428318 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.427791 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/e050081a-408e-4bfd-b874-def2bfd7f635-etc-sysctl-d\") pod \"tuned-tg8h2\" (UID: \"e050081a-408e-4bfd-b874-def2bfd7f635\") " pod="openshift-cluster-node-tuning-operator/tuned-tg8h2" Apr 16 16:47:57.428318 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.427821 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e050081a-408e-4bfd-b874-def2bfd7f635-tmp\") pod \"tuned-tg8h2\" (UID: \"e050081a-408e-4bfd-b874-def2bfd7f635\") " pod="openshift-cluster-node-tuning-operator/tuned-tg8h2" Apr 16 16:47:57.428318 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.427881 2573 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/496bd28d-40d9-43b2-91c6-462df146eecc-multus-conf-dir\") pod \"multus-77mp4\" (UID: \"496bd28d-40d9-43b2-91c6-462df146eecc\") " pod="openshift-multus/multus-77mp4" Apr 16 16:47:57.428318 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.427906 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/496bd28d-40d9-43b2-91c6-462df146eecc-host-run-multus-certs\") pod \"multus-77mp4\" (UID: \"496bd28d-40d9-43b2-91c6-462df146eecc\") " pod="openshift-multus/multus-77mp4" Apr 16 16:47:57.428318 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.427930 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/cc2084a8-5ef5-4dde-bae7-f84589b59b40-systemd-units\") pod \"ovnkube-node-vts2x\" (UID: \"cc2084a8-5ef5-4dde-bae7-f84589b59b40\") " pod="openshift-ovn-kubernetes/ovnkube-node-vts2x" Apr 16 16:47:57.428318 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.427935 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/e050081a-408e-4bfd-b874-def2bfd7f635-etc-sysctl-d\") pod \"tuned-tg8h2\" (UID: \"e050081a-408e-4bfd-b874-def2bfd7f635\") " pod="openshift-cluster-node-tuning-operator/tuned-tg8h2" Apr 16 16:47:57.429112 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.427955 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/cc2084a8-5ef5-4dde-bae7-f84589b59b40-host-slash\") pod \"ovnkube-node-vts2x\" (UID: \"cc2084a8-5ef5-4dde-bae7-f84589b59b40\") " pod="openshift-ovn-kubernetes/ovnkube-node-vts2x" Apr 16 16:47:57.429112 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.427962 2573 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/38487532-b4be-41ef-a345-cea3cc5a643c-tmp-dir\") pod \"node-resolver-c8bgf\" (UID: \"38487532-b4be-41ef-a345-cea3cc5a643c\") " pod="openshift-dns/node-resolver-c8bgf" Apr 16 16:47:57.429112 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.427978 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cc2084a8-5ef5-4dde-bae7-f84589b59b40-var-lib-openvswitch\") pod \"ovnkube-node-vts2x\" (UID: \"cc2084a8-5ef5-4dde-bae7-f84589b59b40\") " pod="openshift-ovn-kubernetes/ovnkube-node-vts2x" Apr 16 16:47:57.429112 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.427984 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/496bd28d-40d9-43b2-91c6-462df146eecc-host-run-multus-certs\") pod \"multus-77mp4\" (UID: \"496bd28d-40d9-43b2-91c6-462df146eecc\") " pod="openshift-multus/multus-77mp4" Apr 16 16:47:57.429112 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.427981 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/496bd28d-40d9-43b2-91c6-462df146eecc-multus-conf-dir\") pod \"multus-77mp4\" (UID: \"496bd28d-40d9-43b2-91c6-462df146eecc\") " pod="openshift-multus/multus-77mp4" Apr 16 16:47:57.429112 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.428014 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/cc2084a8-5ef5-4dde-bae7-f84589b59b40-ovnkube-script-lib\") pod \"ovnkube-node-vts2x\" (UID: \"cc2084a8-5ef5-4dde-bae7-f84589b59b40\") " pod="openshift-ovn-kubernetes/ovnkube-node-vts2x" Apr 16 16:47:57.429112 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.428046 2573 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jpfm4\" (UniqueName: \"kubernetes.io/projected/e050081a-408e-4bfd-b874-def2bfd7f635-kube-api-access-jpfm4\") pod \"tuned-tg8h2\" (UID: \"e050081a-408e-4bfd-b874-def2bfd7f635\") " pod="openshift-cluster-node-tuning-operator/tuned-tg8h2" Apr 16 16:47:57.429112 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.428095 2573 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 16 16:47:57.429112 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.428165 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/496bd28d-40d9-43b2-91c6-462df146eecc-system-cni-dir\") pod \"multus-77mp4\" (UID: \"496bd28d-40d9-43b2-91c6-462df146eecc\") " pod="openshift-multus/multus-77mp4" Apr 16 16:47:57.429112 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.428192 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/496bd28d-40d9-43b2-91c6-462df146eecc-host-var-lib-cni-multus\") pod \"multus-77mp4\" (UID: \"496bd28d-40d9-43b2-91c6-462df146eecc\") " pod="openshift-multus/multus-77mp4" Apr 16 16:47:57.429112 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.428217 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/496bd28d-40d9-43b2-91c6-462df146eecc-hostroot\") pod \"multus-77mp4\" (UID: \"496bd28d-40d9-43b2-91c6-462df146eecc\") " pod="openshift-multus/multus-77mp4" Apr 16 16:47:57.429112 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.428241 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: 
\"kubernetes.io/host-path/a3e5fc61-deb6-4697-bcc8-92eee1a15876-etc-selinux\") pod \"aws-ebs-csi-driver-node-kjtps\" (UID: \"a3e5fc61-deb6-4697-bcc8-92eee1a15876\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kjtps" Apr 16 16:47:57.429112 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.428267 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/38487532-b4be-41ef-a345-cea3cc5a643c-hosts-file\") pod \"node-resolver-c8bgf\" (UID: \"38487532-b4be-41ef-a345-cea3cc5a643c\") " pod="openshift-dns/node-resolver-c8bgf" Apr 16 16:47:57.429112 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.428292 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/496bd28d-40d9-43b2-91c6-462df146eecc-hostroot\") pod \"multus-77mp4\" (UID: \"496bd28d-40d9-43b2-91c6-462df146eecc\") " pod="openshift-multus/multus-77mp4" Apr 16 16:47:57.429112 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.428307 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/496bd28d-40d9-43b2-91c6-462df146eecc-host-var-lib-cni-multus\") pod \"multus-77mp4\" (UID: \"496bd28d-40d9-43b2-91c6-462df146eecc\") " pod="openshift-multus/multus-77mp4" Apr 16 16:47:57.429112 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.428333 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/a3e5fc61-deb6-4697-bcc8-92eee1a15876-etc-selinux\") pod \"aws-ebs-csi-driver-node-kjtps\" (UID: \"a3e5fc61-deb6-4697-bcc8-92eee1a15876\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kjtps" Apr 16 16:47:57.429112 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.428352 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: 
\"kubernetes.io/host-path/38487532-b4be-41ef-a345-cea3cc5a643c-hosts-file\") pod \"node-resolver-c8bgf\" (UID: \"38487532-b4be-41ef-a345-cea3cc5a643c\") " pod="openshift-dns/node-resolver-c8bgf" Apr 16 16:47:57.429112 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.428354 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/496bd28d-40d9-43b2-91c6-462df146eecc-system-cni-dir\") pod \"multus-77mp4\" (UID: \"496bd28d-40d9-43b2-91c6-462df146eecc\") " pod="openshift-multus/multus-77mp4" Apr 16 16:47:57.429837 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.428296 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cc2084a8-5ef5-4dde-bae7-f84589b59b40-host-run-netns\") pod \"ovnkube-node-vts2x\" (UID: \"cc2084a8-5ef5-4dde-bae7-f84589b59b40\") " pod="openshift-ovn-kubernetes/ovnkube-node-vts2x" Apr 16 16:47:57.429837 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.428465 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/1950f42e-3894-4436-a1c6-d5e65379ba61-agent-certs\") pod \"konnectivity-agent-ms2xr\" (UID: \"1950f42e-3894-4436-a1c6-d5e65379ba61\") " pod="kube-system/konnectivity-agent-ms2xr" Apr 16 16:47:57.429837 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.428489 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/50367a6c-7164-45c2-b2f1-af3375aa5768-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-t7jrv\" (UID: \"50367a6c-7164-45c2-b2f1-af3375aa5768\") " pod="openshift-multus/multus-additional-cni-plugins-t7jrv" Apr 16 16:47:57.429837 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.428508 2573 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e050081a-408e-4bfd-b874-def2bfd7f635-sys\") pod \"tuned-tg8h2\" (UID: \"e050081a-408e-4bfd-b874-def2bfd7f635\") " pod="openshift-cluster-node-tuning-operator/tuned-tg8h2" Apr 16 16:47:57.429837 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.428523 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e050081a-408e-4bfd-b874-def2bfd7f635-lib-modules\") pod \"tuned-tg8h2\" (UID: \"e050081a-408e-4bfd-b874-def2bfd7f635\") " pod="openshift-cluster-node-tuning-operator/tuned-tg8h2" Apr 16 16:47:57.429837 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.428542 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/496bd28d-40d9-43b2-91c6-462df146eecc-host-var-lib-cni-bin\") pod \"multus-77mp4\" (UID: \"496bd28d-40d9-43b2-91c6-462df146eecc\") " pod="openshift-multus/multus-77mp4" Apr 16 16:47:57.429837 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.428563 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/a3e5fc61-deb6-4697-bcc8-92eee1a15876-device-dir\") pod \"aws-ebs-csi-driver-node-kjtps\" (UID: \"a3e5fc61-deb6-4697-bcc8-92eee1a15876\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kjtps" Apr 16 16:47:57.429837 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.428586 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/a3e5fc61-deb6-4697-bcc8-92eee1a15876-sys-fs\") pod \"aws-ebs-csi-driver-node-kjtps\" (UID: \"a3e5fc61-deb6-4697-bcc8-92eee1a15876\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kjtps" Apr 16 16:47:57.429837 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.428603 2573 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/06481c75-0539-4928-baa6-a9ed683f7054-iptables-alerter-script\") pod \"iptables-alerter-8ghz4\" (UID: \"06481c75-0539-4928-baa6-a9ed683f7054\") " pod="openshift-network-operator/iptables-alerter-8ghz4" Apr 16 16:47:57.429837 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.428624 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/06481c75-0539-4928-baa6-a9ed683f7054-host-slash\") pod \"iptables-alerter-8ghz4\" (UID: \"06481c75-0539-4928-baa6-a9ed683f7054\") " pod="openshift-network-operator/iptables-alerter-8ghz4" Apr 16 16:47:57.429837 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.428631 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/496bd28d-40d9-43b2-91c6-462df146eecc-multus-daemon-config\") pod \"multus-77mp4\" (UID: \"496bd28d-40d9-43b2-91c6-462df146eecc\") " pod="openshift-multus/multus-77mp4" Apr 16 16:47:57.429837 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.428644 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a3cab79a-4ea1-4744-b205-fd85c929391f-host\") pod \"node-ca-49l6w\" (UID: \"a3cab79a-4ea1-4744-b205-fd85c929391f\") " pod="openshift-image-registry/node-ca-49l6w" Apr 16 16:47:57.429837 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.428673 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a3cab79a-4ea1-4744-b205-fd85c929391f-host\") pod \"node-ca-49l6w\" (UID: \"a3cab79a-4ea1-4744-b205-fd85c929391f\") " pod="openshift-image-registry/node-ca-49l6w" Apr 16 16:47:57.429837 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.428681 2573 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4f8sc\" (UniqueName: \"kubernetes.io/projected/a3cab79a-4ea1-4744-b205-fd85c929391f-kube-api-access-4f8sc\") pod \"node-ca-49l6w\" (UID: \"a3cab79a-4ea1-4744-b205-fd85c929391f\") " pod="openshift-image-registry/node-ca-49l6w" Apr 16 16:47:57.429837 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.428809 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/a3e5fc61-deb6-4697-bcc8-92eee1a15876-device-dir\") pod \"aws-ebs-csi-driver-node-kjtps\" (UID: \"a3e5fc61-deb6-4697-bcc8-92eee1a15876\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kjtps" Apr 16 16:47:57.429837 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.428871 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e050081a-408e-4bfd-b874-def2bfd7f635-sys\") pod \"tuned-tg8h2\" (UID: \"e050081a-408e-4bfd-b874-def2bfd7f635\") " pod="openshift-cluster-node-tuning-operator/tuned-tg8h2" Apr 16 16:47:57.429837 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.428919 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/496bd28d-40d9-43b2-91c6-462df146eecc-host-var-lib-cni-bin\") pod \"multus-77mp4\" (UID: \"496bd28d-40d9-43b2-91c6-462df146eecc\") " pod="openshift-multus/multus-77mp4" Apr 16 16:47:57.430295 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.428931 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/a3e5fc61-deb6-4697-bcc8-92eee1a15876-sys-fs\") pod \"aws-ebs-csi-driver-node-kjtps\" (UID: \"a3e5fc61-deb6-4697-bcc8-92eee1a15876\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kjtps" Apr 16 16:47:57.430295 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.428962 2573 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/496bd28d-40d9-43b2-91c6-462df146eecc-cnibin\") pod \"multus-77mp4\" (UID: \"496bd28d-40d9-43b2-91c6-462df146eecc\") " pod="openshift-multus/multus-77mp4" Apr 16 16:47:57.430295 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.428989 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/496bd28d-40d9-43b2-91c6-462df146eecc-cni-binary-copy\") pod \"multus-77mp4\" (UID: \"496bd28d-40d9-43b2-91c6-462df146eecc\") " pod="openshift-multus/multus-77mp4" Apr 16 16:47:57.430295 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.429017 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dd66d\" (UniqueName: \"kubernetes.io/projected/496bd28d-40d9-43b2-91c6-462df146eecc-kube-api-access-dd66d\") pod \"multus-77mp4\" (UID: \"496bd28d-40d9-43b2-91c6-462df146eecc\") " pod="openshift-multus/multus-77mp4" Apr 16 16:47:57.430295 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.429023 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/496bd28d-40d9-43b2-91c6-462df146eecc-cnibin\") pod \"multus-77mp4\" (UID: \"496bd28d-40d9-43b2-91c6-462df146eecc\") " pod="openshift-multus/multus-77mp4" Apr 16 16:47:57.430295 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.429024 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e050081a-408e-4bfd-b874-def2bfd7f635-lib-modules\") pod \"tuned-tg8h2\" (UID: \"e050081a-408e-4bfd-b874-def2bfd7f635\") " pod="openshift-cluster-node-tuning-operator/tuned-tg8h2" Apr 16 16:47:57.430295 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.429048 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gzdj8\" 
(UniqueName: \"kubernetes.io/projected/a3e5fc61-deb6-4697-bcc8-92eee1a15876-kube-api-access-gzdj8\") pod \"aws-ebs-csi-driver-node-kjtps\" (UID: \"a3e5fc61-deb6-4697-bcc8-92eee1a15876\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kjtps" Apr 16 16:47:57.430295 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.429054 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/50367a6c-7164-45c2-b2f1-af3375aa5768-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-t7jrv\" (UID: \"50367a6c-7164-45c2-b2f1-af3375aa5768\") " pod="openshift-multus/multus-additional-cni-plugins-t7jrv" Apr 16 16:47:57.430295 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.429078 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/cc2084a8-5ef5-4dde-bae7-f84589b59b40-log-socket\") pod \"ovnkube-node-vts2x\" (UID: \"cc2084a8-5ef5-4dde-bae7-f84589b59b40\") " pod="openshift-ovn-kubernetes/ovnkube-node-vts2x" Apr 16 16:47:57.430295 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.429104 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/e050081a-408e-4bfd-b874-def2bfd7f635-etc-sysctl-conf\") pod \"tuned-tg8h2\" (UID: \"e050081a-408e-4bfd-b874-def2bfd7f635\") " pod="openshift-cluster-node-tuning-operator/tuned-tg8h2" Apr 16 16:47:57.430295 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.429130 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/496bd28d-40d9-43b2-91c6-462df146eecc-os-release\") pod \"multus-77mp4\" (UID: \"496bd28d-40d9-43b2-91c6-462df146eecc\") " pod="openshift-multus/multus-77mp4" Apr 16 16:47:57.430295 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.429158 2573 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/cc2084a8-5ef5-4dde-bae7-f84589b59b40-host-kubelet\") pod \"ovnkube-node-vts2x\" (UID: \"cc2084a8-5ef5-4dde-bae7-f84589b59b40\") " pod="openshift-ovn-kubernetes/ovnkube-node-vts2x" Apr 16 16:47:57.430295 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.429185 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/cc2084a8-5ef5-4dde-bae7-f84589b59b40-ovnkube-config\") pod \"ovnkube-node-vts2x\" (UID: \"cc2084a8-5ef5-4dde-bae7-f84589b59b40\") " pod="openshift-ovn-kubernetes/ovnkube-node-vts2x" Apr 16 16:47:57.430295 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.429211 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/e050081a-408e-4bfd-b874-def2bfd7f635-etc-tuned\") pod \"tuned-tg8h2\" (UID: \"e050081a-408e-4bfd-b874-def2bfd7f635\") " pod="openshift-cluster-node-tuning-operator/tuned-tg8h2" Apr 16 16:47:57.430295 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.429235 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a3e5fc61-deb6-4697-bcc8-92eee1a15876-kubelet-dir\") pod \"aws-ebs-csi-driver-node-kjtps\" (UID: \"a3e5fc61-deb6-4697-bcc8-92eee1a15876\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kjtps" Apr 16 16:47:57.430295 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.429250 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/e050081a-408e-4bfd-b874-def2bfd7f635-etc-sysctl-conf\") pod \"tuned-tg8h2\" (UID: \"e050081a-408e-4bfd-b874-def2bfd7f635\") " pod="openshift-cluster-node-tuning-operator/tuned-tg8h2" Apr 16 16:47:57.430295 
ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.429260 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/cc2084a8-5ef5-4dde-bae7-f84589b59b40-run-systemd\") pod \"ovnkube-node-vts2x\" (UID: \"cc2084a8-5ef5-4dde-bae7-f84589b59b40\") " pod="openshift-ovn-kubernetes/ovnkube-node-vts2x" Apr 16 16:47:57.430775 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.429305 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/496bd28d-40d9-43b2-91c6-462df146eecc-os-release\") pod \"multus-77mp4\" (UID: \"496bd28d-40d9-43b2-91c6-462df146eecc\") " pod="openshift-multus/multus-77mp4" Apr 16 16:47:57.430775 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.429317 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a3e5fc61-deb6-4697-bcc8-92eee1a15876-kubelet-dir\") pod \"aws-ebs-csi-driver-node-kjtps\" (UID: \"a3e5fc61-deb6-4697-bcc8-92eee1a15876\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kjtps" Apr 16 16:47:57.430775 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.429334 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cc2084a8-5ef5-4dde-bae7-f84589b59b40-etc-openvswitch\") pod \"ovnkube-node-vts2x\" (UID: \"cc2084a8-5ef5-4dde-bae7-f84589b59b40\") " pod="openshift-ovn-kubernetes/ovnkube-node-vts2x" Apr 16 16:47:57.430775 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.429352 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/cc2084a8-5ef5-4dde-bae7-f84589b59b40-node-log\") pod \"ovnkube-node-vts2x\" (UID: \"cc2084a8-5ef5-4dde-bae7-f84589b59b40\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-vts2x" Apr 16 16:47:57.430775 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.429396 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/e050081a-408e-4bfd-b874-def2bfd7f635-etc-sysconfig\") pod \"tuned-tg8h2\" (UID: \"e050081a-408e-4bfd-b874-def2bfd7f635\") " pod="openshift-cluster-node-tuning-operator/tuned-tg8h2" Apr 16 16:47:57.430775 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.429416 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e050081a-408e-4bfd-b874-def2bfd7f635-etc-kubernetes\") pod \"tuned-tg8h2\" (UID: \"e050081a-408e-4bfd-b874-def2bfd7f635\") " pod="openshift-cluster-node-tuning-operator/tuned-tg8h2" Apr 16 16:47:57.430775 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.429431 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/496bd28d-40d9-43b2-91c6-462df146eecc-multus-cni-dir\") pod \"multus-77mp4\" (UID: \"496bd28d-40d9-43b2-91c6-462df146eecc\") " pod="openshift-multus/multus-77mp4" Apr 16 16:47:57.430775 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.429454 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/496bd28d-40d9-43b2-91c6-462df146eecc-host-var-lib-kubelet\") pod \"multus-77mp4\" (UID: \"496bd28d-40d9-43b2-91c6-462df146eecc\") " pod="openshift-multus/multus-77mp4" Apr 16 16:47:57.430775 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.429470 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/50367a6c-7164-45c2-b2f1-af3375aa5768-system-cni-dir\") pod \"multus-additional-cni-plugins-t7jrv\" (UID: \"50367a6c-7164-45c2-b2f1-af3375aa5768\") " 
pod="openshift-multus/multus-additional-cni-plugins-t7jrv" Apr 16 16:47:57.430775 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.429506 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/496bd28d-40d9-43b2-91c6-462df146eecc-cni-binary-copy\") pod \"multus-77mp4\" (UID: \"496bd28d-40d9-43b2-91c6-462df146eecc\") " pod="openshift-multus/multus-77mp4" Apr 16 16:47:57.430775 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.429508 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e050081a-408e-4bfd-b874-def2bfd7f635-etc-kubernetes\") pod \"tuned-tg8h2\" (UID: \"e050081a-408e-4bfd-b874-def2bfd7f635\") " pod="openshift-cluster-node-tuning-operator/tuned-tg8h2" Apr 16 16:47:57.430775 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.429532 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/50367a6c-7164-45c2-b2f1-af3375aa5768-system-cni-dir\") pod \"multus-additional-cni-plugins-t7jrv\" (UID: \"50367a6c-7164-45c2-b2f1-af3375aa5768\") " pod="openshift-multus/multus-additional-cni-plugins-t7jrv" Apr 16 16:47:57.430775 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.429570 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/e050081a-408e-4bfd-b874-def2bfd7f635-etc-sysconfig\") pod \"tuned-tg8h2\" (UID: \"e050081a-408e-4bfd-b874-def2bfd7f635\") " pod="openshift-cluster-node-tuning-operator/tuned-tg8h2" Apr 16 16:47:57.430775 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.429578 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/496bd28d-40d9-43b2-91c6-462df146eecc-multus-cni-dir\") pod \"multus-77mp4\" (UID: \"496bd28d-40d9-43b2-91c6-462df146eecc\") " 
pod="openshift-multus/multus-77mp4" Apr 16 16:47:57.430775 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.429585 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/496bd28d-40d9-43b2-91c6-462df146eecc-host-var-lib-kubelet\") pod \"multus-77mp4\" (UID: \"496bd28d-40d9-43b2-91c6-462df146eecc\") " pod="openshift-multus/multus-77mp4" Apr 16 16:47:57.431289 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.430942 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e050081a-408e-4bfd-b874-def2bfd7f635-tmp\") pod \"tuned-tg8h2\" (UID: \"e050081a-408e-4bfd-b874-def2bfd7f635\") " pod="openshift-cluster-node-tuning-operator/tuned-tg8h2" Apr 16 16:47:57.431340 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.431301 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/e050081a-408e-4bfd-b874-def2bfd7f635-etc-tuned\") pod \"tuned-tg8h2\" (UID: \"e050081a-408e-4bfd-b874-def2bfd7f635\") " pod="openshift-cluster-node-tuning-operator/tuned-tg8h2" Apr 16 16:47:57.434824 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.434796 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-svwk7\" (UniqueName: \"kubernetes.io/projected/38487532-b4be-41ef-a345-cea3cc5a643c-kube-api-access-svwk7\") pod \"node-resolver-c8bgf\" (UID: \"38487532-b4be-41ef-a345-cea3cc5a643c\") " pod="openshift-dns/node-resolver-c8bgf" Apr 16 16:47:57.435478 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.435433 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2k928\" (UniqueName: \"kubernetes.io/projected/50367a6c-7164-45c2-b2f1-af3375aa5768-kube-api-access-2k928\") pod \"multus-additional-cni-plugins-t7jrv\" (UID: \"50367a6c-7164-45c2-b2f1-af3375aa5768\") " 
pod="openshift-multus/multus-additional-cni-plugins-t7jrv" Apr 16 16:47:57.435980 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.435957 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2mj2\" (UniqueName: \"kubernetes.io/projected/f7e26d85-638f-42c1-9b32-67320a5cbbe3-kube-api-access-g2mj2\") pod \"network-metrics-daemon-x6gbd\" (UID: \"f7e26d85-638f-42c1-9b32-67320a5cbbe3\") " pod="openshift-multus/network-metrics-daemon-x6gbd" Apr 16 16:47:57.436486 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.436459 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jpfm4\" (UniqueName: \"kubernetes.io/projected/e050081a-408e-4bfd-b874-def2bfd7f635-kube-api-access-jpfm4\") pod \"tuned-tg8h2\" (UID: \"e050081a-408e-4bfd-b874-def2bfd7f635\") " pod="openshift-cluster-node-tuning-operator/tuned-tg8h2" Apr 16 16:47:57.437032 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.437013 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4f8sc\" (UniqueName: \"kubernetes.io/projected/a3cab79a-4ea1-4744-b205-fd85c929391f-kube-api-access-4f8sc\") pod \"node-ca-49l6w\" (UID: \"a3cab79a-4ea1-4744-b205-fd85c929391f\") " pod="openshift-image-registry/node-ca-49l6w" Apr 16 16:47:57.437236 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.437212 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzdj8\" (UniqueName: \"kubernetes.io/projected/a3e5fc61-deb6-4697-bcc8-92eee1a15876-kube-api-access-gzdj8\") pod \"aws-ebs-csi-driver-node-kjtps\" (UID: \"a3e5fc61-deb6-4697-bcc8-92eee1a15876\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kjtps" Apr 16 16:47:57.437466 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.437449 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dd66d\" (UniqueName: 
\"kubernetes.io/projected/496bd28d-40d9-43b2-91c6-462df146eecc-kube-api-access-dd66d\") pod \"multus-77mp4\" (UID: \"496bd28d-40d9-43b2-91c6-462df146eecc\") " pod="openshift-multus/multus-77mp4" Apr 16 16:47:57.530008 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.529923 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/1950f42e-3894-4436-a1c6-d5e65379ba61-konnectivity-ca\") pod \"konnectivity-agent-ms2xr\" (UID: \"1950f42e-3894-4436-a1c6-d5e65379ba61\") " pod="kube-system/konnectivity-agent-ms2xr" Apr 16 16:47:57.530008 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.529966 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/cc2084a8-5ef5-4dde-bae7-f84589b59b40-run-ovn\") pod \"ovnkube-node-vts2x\" (UID: \"cc2084a8-5ef5-4dde-bae7-f84589b59b40\") " pod="openshift-ovn-kubernetes/ovnkube-node-vts2x" Apr 16 16:47:57.530008 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.529992 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/cc2084a8-5ef5-4dde-bae7-f84589b59b40-ovn-node-metrics-cert\") pod \"ovnkube-node-vts2x\" (UID: \"cc2084a8-5ef5-4dde-bae7-f84589b59b40\") " pod="openshift-ovn-kubernetes/ovnkube-node-vts2x" Apr 16 16:47:57.530252 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.530017 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cc2084a8-5ef5-4dde-bae7-f84589b59b40-host-cni-bin\") pod \"ovnkube-node-vts2x\" (UID: \"cc2084a8-5ef5-4dde-bae7-f84589b59b40\") " pod="openshift-ovn-kubernetes/ovnkube-node-vts2x" Apr 16 16:47:57.530252 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.530038 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/cc2084a8-5ef5-4dde-bae7-f84589b59b40-env-overrides\") pod \"ovnkube-node-vts2x\" (UID: \"cc2084a8-5ef5-4dde-bae7-f84589b59b40\") " pod="openshift-ovn-kubernetes/ovnkube-node-vts2x" Apr 16 16:47:57.530252 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.530058 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lsd2c\" (UniqueName: \"kubernetes.io/projected/cc2084a8-5ef5-4dde-bae7-f84589b59b40-kube-api-access-lsd2c\") pod \"ovnkube-node-vts2x\" (UID: \"cc2084a8-5ef5-4dde-bae7-f84589b59b40\") " pod="openshift-ovn-kubernetes/ovnkube-node-vts2x" Apr 16 16:47:57.530252 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.530086 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cc2084a8-5ef5-4dde-bae7-f84589b59b40-host-run-ovn-kubernetes\") pod \"ovnkube-node-vts2x\" (UID: \"cc2084a8-5ef5-4dde-bae7-f84589b59b40\") " pod="openshift-ovn-kubernetes/ovnkube-node-vts2x" Apr 16 16:47:57.530252 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.530107 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/cc2084a8-5ef5-4dde-bae7-f84589b59b40-host-cni-netd\") pod \"ovnkube-node-vts2x\" (UID: \"cc2084a8-5ef5-4dde-bae7-f84589b59b40\") " pod="openshift-ovn-kubernetes/ovnkube-node-vts2x" Apr 16 16:47:57.530252 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.530133 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/cc2084a8-5ef5-4dde-bae7-f84589b59b40-systemd-units\") pod \"ovnkube-node-vts2x\" (UID: \"cc2084a8-5ef5-4dde-bae7-f84589b59b40\") " pod="openshift-ovn-kubernetes/ovnkube-node-vts2x" Apr 16 16:47:57.530252 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.530153 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/cc2084a8-5ef5-4dde-bae7-f84589b59b40-host-slash\") pod \"ovnkube-node-vts2x\" (UID: \"cc2084a8-5ef5-4dde-bae7-f84589b59b40\") " pod="openshift-ovn-kubernetes/ovnkube-node-vts2x" Apr 16 16:47:57.530252 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.530178 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cc2084a8-5ef5-4dde-bae7-f84589b59b40-var-lib-openvswitch\") pod \"ovnkube-node-vts2x\" (UID: \"cc2084a8-5ef5-4dde-bae7-f84589b59b40\") " pod="openshift-ovn-kubernetes/ovnkube-node-vts2x" Apr 16 16:47:57.530252 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.530203 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/cc2084a8-5ef5-4dde-bae7-f84589b59b40-ovnkube-script-lib\") pod \"ovnkube-node-vts2x\" (UID: \"cc2084a8-5ef5-4dde-bae7-f84589b59b40\") " pod="openshift-ovn-kubernetes/ovnkube-node-vts2x" Apr 16 16:47:57.530252 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.530234 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cc2084a8-5ef5-4dde-bae7-f84589b59b40-host-run-netns\") pod \"ovnkube-node-vts2x\" (UID: \"cc2084a8-5ef5-4dde-bae7-f84589b59b40\") " pod="openshift-ovn-kubernetes/ovnkube-node-vts2x" Apr 16 16:47:57.530706 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.530258 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/1950f42e-3894-4436-a1c6-d5e65379ba61-agent-certs\") pod \"konnectivity-agent-ms2xr\" (UID: \"1950f42e-3894-4436-a1c6-d5e65379ba61\") " pod="kube-system/konnectivity-agent-ms2xr" Apr 16 16:47:57.530706 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.530290 2573 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/06481c75-0539-4928-baa6-a9ed683f7054-iptables-alerter-script\") pod \"iptables-alerter-8ghz4\" (UID: \"06481c75-0539-4928-baa6-a9ed683f7054\") " pod="openshift-network-operator/iptables-alerter-8ghz4" Apr 16 16:47:57.530706 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.530316 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/06481c75-0539-4928-baa6-a9ed683f7054-host-slash\") pod \"iptables-alerter-8ghz4\" (UID: \"06481c75-0539-4928-baa6-a9ed683f7054\") " pod="openshift-network-operator/iptables-alerter-8ghz4" Apr 16 16:47:57.530706 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.530347 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/cc2084a8-5ef5-4dde-bae7-f84589b59b40-log-socket\") pod \"ovnkube-node-vts2x\" (UID: \"cc2084a8-5ef5-4dde-bae7-f84589b59b40\") " pod="openshift-ovn-kubernetes/ovnkube-node-vts2x" Apr 16 16:47:57.530706 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.530372 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/cc2084a8-5ef5-4dde-bae7-f84589b59b40-host-kubelet\") pod \"ovnkube-node-vts2x\" (UID: \"cc2084a8-5ef5-4dde-bae7-f84589b59b40\") " pod="openshift-ovn-kubernetes/ovnkube-node-vts2x" Apr 16 16:47:57.530706 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.530419 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/cc2084a8-5ef5-4dde-bae7-f84589b59b40-ovnkube-config\") pod \"ovnkube-node-vts2x\" (UID: \"cc2084a8-5ef5-4dde-bae7-f84589b59b40\") " pod="openshift-ovn-kubernetes/ovnkube-node-vts2x" Apr 16 16:47:57.530706 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.530445 2573 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/cc2084a8-5ef5-4dde-bae7-f84589b59b40-run-systemd\") pod \"ovnkube-node-vts2x\" (UID: \"cc2084a8-5ef5-4dde-bae7-f84589b59b40\") " pod="openshift-ovn-kubernetes/ovnkube-node-vts2x" Apr 16 16:47:57.530706 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.530468 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cc2084a8-5ef5-4dde-bae7-f84589b59b40-etc-openvswitch\") pod \"ovnkube-node-vts2x\" (UID: \"cc2084a8-5ef5-4dde-bae7-f84589b59b40\") " pod="openshift-ovn-kubernetes/ovnkube-node-vts2x" Apr 16 16:47:57.530706 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.530501 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/cc2084a8-5ef5-4dde-bae7-f84589b59b40-node-log\") pod \"ovnkube-node-vts2x\" (UID: \"cc2084a8-5ef5-4dde-bae7-f84589b59b40\") " pod="openshift-ovn-kubernetes/ovnkube-node-vts2x" Apr 16 16:47:57.530706 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.530534 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f7dsv\" (UniqueName: \"kubernetes.io/projected/41dd0be2-82c4-4469-b8d9-d1b98a4adb55-kube-api-access-f7dsv\") pod \"network-check-target-md7k7\" (UID: \"41dd0be2-82c4-4469-b8d9-d1b98a4adb55\") " pod="openshift-network-diagnostics/network-check-target-md7k7" Apr 16 16:47:57.530706 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.530561 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cc2084a8-5ef5-4dde-bae7-f84589b59b40-run-openvswitch\") pod \"ovnkube-node-vts2x\" (UID: \"cc2084a8-5ef5-4dde-bae7-f84589b59b40\") " pod="openshift-ovn-kubernetes/ovnkube-node-vts2x" Apr 16 16:47:57.530706 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.530565 
2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cc2084a8-5ef5-4dde-bae7-f84589b59b40-var-lib-openvswitch\") pod \"ovnkube-node-vts2x\" (UID: \"cc2084a8-5ef5-4dde-bae7-f84589b59b40\") " pod="openshift-ovn-kubernetes/ovnkube-node-vts2x" Apr 16 16:47:57.530706 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.530580 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/06481c75-0539-4928-baa6-a9ed683f7054-host-slash\") pod \"iptables-alerter-8ghz4\" (UID: \"06481c75-0539-4928-baa6-a9ed683f7054\") " pod="openshift-network-operator/iptables-alerter-8ghz4" Apr 16 16:47:57.530706 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.530628 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/cc2084a8-5ef5-4dde-bae7-f84589b59b40-host-cni-netd\") pod \"ovnkube-node-vts2x\" (UID: \"cc2084a8-5ef5-4dde-bae7-f84589b59b40\") " pod="openshift-ovn-kubernetes/ovnkube-node-vts2x" Apr 16 16:47:57.530706 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.530637 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cc2084a8-5ef5-4dde-bae7-f84589b59b40-host-run-ovn-kubernetes\") pod \"ovnkube-node-vts2x\" (UID: \"cc2084a8-5ef5-4dde-bae7-f84589b59b40\") " pod="openshift-ovn-kubernetes/ovnkube-node-vts2x" Apr 16 16:47:57.530706 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.530670 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/cc2084a8-5ef5-4dde-bae7-f84589b59b40-systemd-units\") pod \"ovnkube-node-vts2x\" (UID: \"cc2084a8-5ef5-4dde-bae7-f84589b59b40\") " pod="openshift-ovn-kubernetes/ovnkube-node-vts2x" Apr 16 16:47:57.530706 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.530677 
2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/cc2084a8-5ef5-4dde-bae7-f84589b59b40-run-ovn\") pod \"ovnkube-node-vts2x\" (UID: \"cc2084a8-5ef5-4dde-bae7-f84589b59b40\") " pod="openshift-ovn-kubernetes/ovnkube-node-vts2x" Apr 16 16:47:57.531462 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.530701 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/cc2084a8-5ef5-4dde-bae7-f84589b59b40-host-slash\") pod \"ovnkube-node-vts2x\" (UID: \"cc2084a8-5ef5-4dde-bae7-f84589b59b40\") " pod="openshift-ovn-kubernetes/ovnkube-node-vts2x" Apr 16 16:47:57.531462 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.530718 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/cc2084a8-5ef5-4dde-bae7-f84589b59b40-run-systemd\") pod \"ovnkube-node-vts2x\" (UID: \"cc2084a8-5ef5-4dde-bae7-f84589b59b40\") " pod="openshift-ovn-kubernetes/ovnkube-node-vts2x" Apr 16 16:47:57.531462 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.530761 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/cc2084a8-5ef5-4dde-bae7-f84589b59b40-log-socket\") pod \"ovnkube-node-vts2x\" (UID: \"cc2084a8-5ef5-4dde-bae7-f84589b59b40\") " pod="openshift-ovn-kubernetes/ovnkube-node-vts2x" Apr 16 16:47:57.531462 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.530765 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cc2084a8-5ef5-4dde-bae7-f84589b59b40-host-cni-bin\") pod \"ovnkube-node-vts2x\" (UID: \"cc2084a8-5ef5-4dde-bae7-f84589b59b40\") " pod="openshift-ovn-kubernetes/ovnkube-node-vts2x" Apr 16 16:47:57.531462 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.530834 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cc2084a8-5ef5-4dde-bae7-f84589b59b40-etc-openvswitch\") pod \"ovnkube-node-vts2x\" (UID: \"cc2084a8-5ef5-4dde-bae7-f84589b59b40\") " pod="openshift-ovn-kubernetes/ovnkube-node-vts2x" Apr 16 16:47:57.531462 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.530885 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/cc2084a8-5ef5-4dde-bae7-f84589b59b40-node-log\") pod \"ovnkube-node-vts2x\" (UID: \"cc2084a8-5ef5-4dde-bae7-f84589b59b40\") " pod="openshift-ovn-kubernetes/ovnkube-node-vts2x" Apr 16 16:47:57.531462 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.531067 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cc2084a8-5ef5-4dde-bae7-f84589b59b40-run-openvswitch\") pod \"ovnkube-node-vts2x\" (UID: \"cc2084a8-5ef5-4dde-bae7-f84589b59b40\") " pod="openshift-ovn-kubernetes/ovnkube-node-vts2x" Apr 16 16:47:57.531462 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.531121 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cc2084a8-5ef5-4dde-bae7-f84589b59b40-host-run-netns\") pod \"ovnkube-node-vts2x\" (UID: \"cc2084a8-5ef5-4dde-bae7-f84589b59b40\") " pod="openshift-ovn-kubernetes/ovnkube-node-vts2x" Apr 16 16:47:57.531462 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.530586 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fsl6f\" (UniqueName: \"kubernetes.io/projected/06481c75-0539-4928-baa6-a9ed683f7054-kube-api-access-fsl6f\") pod \"iptables-alerter-8ghz4\" (UID: \"06481c75-0539-4928-baa6-a9ed683f7054\") " pod="openshift-network-operator/iptables-alerter-8ghz4" Apr 16 16:47:57.531462 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.531277 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/cc2084a8-5ef5-4dde-bae7-f84589b59b40-host-kubelet\") pod \"ovnkube-node-vts2x\" (UID: \"cc2084a8-5ef5-4dde-bae7-f84589b59b40\") " pod="openshift-ovn-kubernetes/ovnkube-node-vts2x" Apr 16 16:47:57.531462 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.531277 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cc2084a8-5ef5-4dde-bae7-f84589b59b40-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vts2x\" (UID: \"cc2084a8-5ef5-4dde-bae7-f84589b59b40\") " pod="openshift-ovn-kubernetes/ovnkube-node-vts2x" Apr 16 16:47:57.531462 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.531237 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/cc2084a8-5ef5-4dde-bae7-f84589b59b40-ovnkube-script-lib\") pod \"ovnkube-node-vts2x\" (UID: \"cc2084a8-5ef5-4dde-bae7-f84589b59b40\") " pod="openshift-ovn-kubernetes/ovnkube-node-vts2x" Apr 16 16:47:57.531462 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.531328 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cc2084a8-5ef5-4dde-bae7-f84589b59b40-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vts2x\" (UID: \"cc2084a8-5ef5-4dde-bae7-f84589b59b40\") " pod="openshift-ovn-kubernetes/ovnkube-node-vts2x" Apr 16 16:47:57.531462 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.531343 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/cc2084a8-5ef5-4dde-bae7-f84589b59b40-ovnkube-config\") pod \"ovnkube-node-vts2x\" (UID: \"cc2084a8-5ef5-4dde-bae7-f84589b59b40\") " pod="openshift-ovn-kubernetes/ovnkube-node-vts2x" Apr 16 16:47:57.531462 ip-10-0-138-58 kubenswrapper[2573]: 
I0416 16:47:57.531364 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/06481c75-0539-4928-baa6-a9ed683f7054-iptables-alerter-script\") pod \"iptables-alerter-8ghz4\" (UID: \"06481c75-0539-4928-baa6-a9ed683f7054\") " pod="openshift-network-operator/iptables-alerter-8ghz4" Apr 16 16:47:57.531462 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.531369 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/cc2084a8-5ef5-4dde-bae7-f84589b59b40-env-overrides\") pod \"ovnkube-node-vts2x\" (UID: \"cc2084a8-5ef5-4dde-bae7-f84589b59b40\") " pod="openshift-ovn-kubernetes/ovnkube-node-vts2x" Apr 16 16:47:57.531462 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.531439 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/1950f42e-3894-4436-a1c6-d5e65379ba61-konnectivity-ca\") pod \"konnectivity-agent-ms2xr\" (UID: \"1950f42e-3894-4436-a1c6-d5e65379ba61\") " pod="kube-system/konnectivity-agent-ms2xr" Apr 16 16:47:57.533802 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.533783 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/1950f42e-3894-4436-a1c6-d5e65379ba61-agent-certs\") pod \"konnectivity-agent-ms2xr\" (UID: \"1950f42e-3894-4436-a1c6-d5e65379ba61\") " pod="kube-system/konnectivity-agent-ms2xr" Apr 16 16:47:57.534115 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.534099 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/cc2084a8-5ef5-4dde-bae7-f84589b59b40-ovn-node-metrics-cert\") pod \"ovnkube-node-vts2x\" (UID: \"cc2084a8-5ef5-4dde-bae7-f84589b59b40\") " pod="openshift-ovn-kubernetes/ovnkube-node-vts2x" Apr 16 16:47:57.538093 ip-10-0-138-58 kubenswrapper[2573]: 
E0416 16:47:57.538070 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 16:47:57.538179 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:47:57.538097 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 16:47:57.538179 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:47:57.538108 2573 projected.go:194] Error preparing data for projected volume kube-api-access-f7dsv for pod openshift-network-diagnostics/network-check-target-md7k7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:47:57.538179 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:47:57.538178 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/41dd0be2-82c4-4469-b8d9-d1b98a4adb55-kube-api-access-f7dsv podName:41dd0be2-82c4-4469-b8d9-d1b98a4adb55 nodeName:}" failed. No retries permitted until 2026-04-16 16:47:58.038162263 +0000 UTC m=+3.178880985 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-f7dsv" (UniqueName: "kubernetes.io/projected/41dd0be2-82c4-4469-b8d9-d1b98a4adb55-kube-api-access-f7dsv") pod "network-check-target-md7k7" (UID: "41dd0be2-82c4-4469-b8d9-d1b98a4adb55") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:47:57.539975 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.539938 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsl6f\" (UniqueName: \"kubernetes.io/projected/06481c75-0539-4928-baa6-a9ed683f7054-kube-api-access-fsl6f\") pod \"iptables-alerter-8ghz4\" (UID: \"06481c75-0539-4928-baa6-a9ed683f7054\") " pod="openshift-network-operator/iptables-alerter-8ghz4" Apr 16 16:47:57.540073 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.540061 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lsd2c\" (UniqueName: \"kubernetes.io/projected/cc2084a8-5ef5-4dde-bae7-f84589b59b40-kube-api-access-lsd2c\") pod \"ovnkube-node-vts2x\" (UID: \"cc2084a8-5ef5-4dde-bae7-f84589b59b40\") " pod="openshift-ovn-kubernetes/ovnkube-node-vts2x" Apr 16 16:47:57.617780 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.617751 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-t7jrv" Apr 16 16:47:57.624449 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.624428 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-tg8h2" Apr 16 16:47:57.632010 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.631988 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-c8bgf" Apr 16 16:47:57.638553 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.638531 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-49l6w" Apr 16 16:47:57.646142 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.646119 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kjtps" Apr 16 16:47:57.651726 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.651708 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-77mp4" Apr 16 16:47:57.658304 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.658286 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-8ghz4" Apr 16 16:47:57.664917 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.664897 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-vts2x" Apr 16 16:47:57.669533 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.669514 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-ms2xr" Apr 16 16:47:57.935643 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:57.935563 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f7e26d85-638f-42c1-9b32-67320a5cbbe3-metrics-certs\") pod \"network-metrics-daemon-x6gbd\" (UID: \"f7e26d85-638f-42c1-9b32-67320a5cbbe3\") " pod="openshift-multus/network-metrics-daemon-x6gbd" Apr 16 16:47:57.935768 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:47:57.935710 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:47:57.935817 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:47:57.935769 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f7e26d85-638f-42c1-9b32-67320a5cbbe3-metrics-certs podName:f7e26d85-638f-42c1-9b32-67320a5cbbe3 nodeName:}" failed. No retries permitted until 2026-04-16 16:47:58.935754869 +0000 UTC m=+4.076473593 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f7e26d85-638f-42c1-9b32-67320a5cbbe3-metrics-certs") pod "network-metrics-daemon-x6gbd" (UID: "f7e26d85-638f-42c1-9b32-67320a5cbbe3") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:47:57.962695 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:57.962609 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcc2084a8_5ef5_4dde_bae7_f84589b59b40.slice/crio-ee1d059cf5a7b64797bc325fa66d950f916a2868e9a3a844fb7e9fdcc27e4608 WatchSource:0}: Error finding container ee1d059cf5a7b64797bc325fa66d950f916a2868e9a3a844fb7e9fdcc27e4608: Status 404 returned error can't find the container with id ee1d059cf5a7b64797bc325fa66d950f916a2868e9a3a844fb7e9fdcc27e4608 Apr 16 16:47:57.964416 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:57.964392 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode050081a_408e_4bfd_b874_def2bfd7f635.slice/crio-15c119f92a59e03e782500ebf6d9546f2ba9bf8a5be1d46d0816d45401ad2fbe WatchSource:0}: Error finding container 15c119f92a59e03e782500ebf6d9546f2ba9bf8a5be1d46d0816d45401ad2fbe: Status 404 returned error can't find the container with id 15c119f92a59e03e782500ebf6d9546f2ba9bf8a5be1d46d0816d45401ad2fbe Apr 16 16:47:57.965344 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:57.965322 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3cab79a_4ea1_4744_b205_fd85c929391f.slice/crio-9afc1dc770ab873f71ce11b176b5d886fdb6637683588cdd67f47c781a8f6c71 WatchSource:0}: Error finding container 9afc1dc770ab873f71ce11b176b5d886fdb6637683588cdd67f47c781a8f6c71: Status 404 returned error can't find the container with id 9afc1dc770ab873f71ce11b176b5d886fdb6637683588cdd67f47c781a8f6c71 Apr 16 16:47:57.969643 
ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:57.969622 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1950f42e_3894_4436_a1c6_d5e65379ba61.slice/crio-212a1d40943b5af56b6854137fcc1239a5045521ff6e807f6cac500bd53b0c36 WatchSource:0}: Error finding container 212a1d40943b5af56b6854137fcc1239a5045521ff6e807f6cac500bd53b0c36: Status 404 returned error can't find the container with id 212a1d40943b5af56b6854137fcc1239a5045521ff6e807f6cac500bd53b0c36 Apr 16 16:47:57.971668 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:57.971648 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3e5fc61_deb6_4697_bcc8_92eee1a15876.slice/crio-50307aac718e3c00732efa34e209535619f28d7702f57f62f080438390021f58 WatchSource:0}: Error finding container 50307aac718e3c00732efa34e209535619f28d7702f57f62f080438390021f58: Status 404 returned error can't find the container with id 50307aac718e3c00732efa34e209535619f28d7702f57f62f080438390021f58 Apr 16 16:47:57.972530 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:47:57.972473 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38487532_b4be_41ef_a345_cea3cc5a643c.slice/crio-5a07e67452bec3454a98d7a816a70210a3e0c3eae36c46d7372f86e0890e7dec WatchSource:0}: Error finding container 5a07e67452bec3454a98d7a816a70210a3e0c3eae36c46d7372f86e0890e7dec: Status 404 returned error can't find the container with id 5a07e67452bec3454a98d7a816a70210a3e0c3eae36c46d7372f86e0890e7dec Apr 16 16:47:58.136607 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:58.136569 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f7dsv\" (UniqueName: \"kubernetes.io/projected/41dd0be2-82c4-4469-b8d9-d1b98a4adb55-kube-api-access-f7dsv\") pod \"network-check-target-md7k7\" (UID: 
\"41dd0be2-82c4-4469-b8d9-d1b98a4adb55\") " pod="openshift-network-diagnostics/network-check-target-md7k7" Apr 16 16:47:58.136741 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:47:58.136719 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 16:47:58.136741 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:47:58.136735 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 16:47:58.136833 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:47:58.136743 2573 projected.go:194] Error preparing data for projected volume kube-api-access-f7dsv for pod openshift-network-diagnostics/network-check-target-md7k7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:47:58.136833 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:47:58.136795 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/41dd0be2-82c4-4469-b8d9-d1b98a4adb55-kube-api-access-f7dsv podName:41dd0be2-82c4-4469-b8d9-d1b98a4adb55 nodeName:}" failed. No retries permitted until 2026-04-16 16:47:59.136780717 +0000 UTC m=+4.277499437 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-f7dsv" (UniqueName: "kubernetes.io/projected/41dd0be2-82c4-4469-b8d9-d1b98a4adb55-kube-api-access-f7dsv") pod "network-check-target-md7k7" (UID: "41dd0be2-82c4-4469-b8d9-d1b98a4adb55") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:47:58.347286 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:58.347199 2573 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 16:47:58.365635 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:58.365574 2573 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 16:42:56 +0000 UTC" deadline="2027-12-04 14:22:47.633270823 +0000 UTC" Apr 16 16:47:58.365635 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:58.365607 2573 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14325h34m49.267667622s" Apr 16 16:47:58.425910 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:58.425841 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-tg8h2" event={"ID":"e050081a-408e-4bfd-b874-def2bfd7f635","Type":"ContainerStarted","Data":"15c119f92a59e03e782500ebf6d9546f2ba9bf8a5be1d46d0816d45401ad2fbe"} Apr 16 16:47:58.433088 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:58.433028 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-c8bgf" event={"ID":"38487532-b4be-41ef-a345-cea3cc5a643c","Type":"ContainerStarted","Data":"5a07e67452bec3454a98d7a816a70210a3e0c3eae36c46d7372f86e0890e7dec"} Apr 16 16:47:58.437054 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:58.437002 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-77mp4" 
event={"ID":"496bd28d-40d9-43b2-91c6-462df146eecc","Type":"ContainerStarted","Data":"bd8cb270e0da094fb787d43206ad3030db5494f262222482bfc1f276c18978b1"} Apr 16 16:47:58.439730 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:58.439697 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-t7jrv" event={"ID":"50367a6c-7164-45c2-b2f1-af3375aa5768","Type":"ContainerStarted","Data":"52ff80f83c431ef34ae78cc9a1f62a8440685805a7e9d1391dc985bf2f3a5615"} Apr 16 16:47:58.449692 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:58.449667 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-49l6w" event={"ID":"a3cab79a-4ea1-4744-b205-fd85c929391f","Type":"ContainerStarted","Data":"9afc1dc770ab873f71ce11b176b5d886fdb6637683588cdd67f47c781a8f6c71"} Apr 16 16:47:58.454364 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:58.454335 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vts2x" event={"ID":"cc2084a8-5ef5-4dde-bae7-f84589b59b40","Type":"ContainerStarted","Data":"ee1d059cf5a7b64797bc325fa66d950f916a2868e9a3a844fb7e9fdcc27e4608"} Apr 16 16:47:58.462119 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:58.462095 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-58.ec2.internal" event={"ID":"43c81a0a9a01b0c05b287312dc013cbc","Type":"ContainerStarted","Data":"33a2aac2ff36bdf54375aab68a1c93ac07ba09639368c206c4367a130a6b832d"} Apr 16 16:47:58.466854 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:58.466834 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-8ghz4" event={"ID":"06481c75-0539-4928-baa6-a9ed683f7054","Type":"ContainerStarted","Data":"947eb34c8496319643c5b00ba82fd5953fe3da6683ce7ebd442677ab5cb3c766"} Apr 16 16:47:58.478466 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:58.478434 2573 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kjtps" event={"ID":"a3e5fc61-deb6-4697-bcc8-92eee1a15876","Type":"ContainerStarted","Data":"50307aac718e3c00732efa34e209535619f28d7702f57f62f080438390021f58"} Apr 16 16:47:58.487356 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:58.487327 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-ms2xr" event={"ID":"1950f42e-3894-4436-a1c6-d5e65379ba61","Type":"ContainerStarted","Data":"212a1d40943b5af56b6854137fcc1239a5045521ff6e807f6cac500bd53b0c36"} Apr 16 16:47:58.943990 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:58.943132 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f7e26d85-638f-42c1-9b32-67320a5cbbe3-metrics-certs\") pod \"network-metrics-daemon-x6gbd\" (UID: \"f7e26d85-638f-42c1-9b32-67320a5cbbe3\") " pod="openshift-multus/network-metrics-daemon-x6gbd" Apr 16 16:47:58.943990 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:47:58.943316 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:47:58.943990 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:47:58.943431 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f7e26d85-638f-42c1-9b32-67320a5cbbe3-metrics-certs podName:f7e26d85-638f-42c1-9b32-67320a5cbbe3 nodeName:}" failed. No retries permitted until 2026-04-16 16:48:00.94336112 +0000 UTC m=+6.084079842 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f7e26d85-638f-42c1-9b32-67320a5cbbe3-metrics-certs") pod "network-metrics-daemon-x6gbd" (UID: "f7e26d85-638f-42c1-9b32-67320a5cbbe3") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:47:59.144911 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:59.144821 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f7dsv\" (UniqueName: \"kubernetes.io/projected/41dd0be2-82c4-4469-b8d9-d1b98a4adb55-kube-api-access-f7dsv\") pod \"network-check-target-md7k7\" (UID: \"41dd0be2-82c4-4469-b8d9-d1b98a4adb55\") " pod="openshift-network-diagnostics/network-check-target-md7k7" Apr 16 16:47:59.145070 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:47:59.145011 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 16:47:59.145070 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:47:59.145029 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 16:47:59.145070 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:47:59.145041 2573 projected.go:194] Error preparing data for projected volume kube-api-access-f7dsv for pod openshift-network-diagnostics/network-check-target-md7k7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:47:59.145219 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:47:59.145095 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/41dd0be2-82c4-4469-b8d9-d1b98a4adb55-kube-api-access-f7dsv podName:41dd0be2-82c4-4469-b8d9-d1b98a4adb55 nodeName:}" failed. 
No retries permitted until 2026-04-16 16:48:01.14507822 +0000 UTC m=+6.285796960 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-f7dsv" (UniqueName: "kubernetes.io/projected/41dd0be2-82c4-4469-b8d9-d1b98a4adb55-kube-api-access-f7dsv") pod "network-check-target-md7k7" (UID: "41dd0be2-82c4-4469-b8d9-d1b98a4adb55") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:47:59.416932 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:59.416781 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-md7k7" Apr 16 16:47:59.416932 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:59.416781 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x6gbd" Apr 16 16:47:59.417448 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:47:59.416935 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-md7k7" podUID="41dd0be2-82c4-4469-b8d9-d1b98a4adb55" Apr 16 16:47:59.417448 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:47:59.417107 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-x6gbd" podUID="f7e26d85-638f-42c1-9b32-67320a5cbbe3" Apr 16 16:47:59.533321 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:59.533280 2573 generic.go:358] "Generic (PLEG): container finished" podID="82016c9c370b8a67ea924de37e34c743" containerID="76c2bf50e3d3dab2543457a6fcb88345adcfd68200643ba310a804f2bd84f6af" exitCode=0 Apr 16 16:47:59.534298 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:59.534265 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-58.ec2.internal" event={"ID":"82016c9c370b8a67ea924de37e34c743","Type":"ContainerDied","Data":"76c2bf50e3d3dab2543457a6fcb88345adcfd68200643ba310a804f2bd84f6af"} Apr 16 16:47:59.551784 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:47:59.551735 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-58.ec2.internal" podStartSLOduration=3.551717095 podStartE2EDuration="3.551717095s" podCreationTimestamp="2026-04-16 16:47:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:47:58.478591589 +0000 UTC m=+3.619310332" watchObservedRunningTime="2026-04-16 16:47:59.551717095 +0000 UTC m=+4.692435838" Apr 16 16:48:00.540900 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:00.540861 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-58.ec2.internal" event={"ID":"82016c9c370b8a67ea924de37e34c743","Type":"ContainerStarted","Data":"269501033b27e2908f247200915ce38926e7ebbad7945ebf3877bbfaa9bc685f"} Apr 16 16:48:00.557833 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:00.557567 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-58.ec2.internal" podStartSLOduration=4.557548488 
podStartE2EDuration="4.557548488s" podCreationTimestamp="2026-04-16 16:47:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:48:00.556919702 +0000 UTC m=+5.697638444" watchObservedRunningTime="2026-04-16 16:48:00.557548488 +0000 UTC m=+5.698267234" Apr 16 16:48:00.959442 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:00.959344 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f7e26d85-638f-42c1-9b32-67320a5cbbe3-metrics-certs\") pod \"network-metrics-daemon-x6gbd\" (UID: \"f7e26d85-638f-42c1-9b32-67320a5cbbe3\") " pod="openshift-multus/network-metrics-daemon-x6gbd" Apr 16 16:48:00.959604 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:00.959541 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:48:00.959674 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:00.959609 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f7e26d85-638f-42c1-9b32-67320a5cbbe3-metrics-certs podName:f7e26d85-638f-42c1-9b32-67320a5cbbe3 nodeName:}" failed. No retries permitted until 2026-04-16 16:48:04.959590047 +0000 UTC m=+10.100308788 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f7e26d85-638f-42c1-9b32-67320a5cbbe3-metrics-certs") pod "network-metrics-daemon-x6gbd" (UID: "f7e26d85-638f-42c1-9b32-67320a5cbbe3") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:48:01.161785 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:01.161514 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f7dsv\" (UniqueName: \"kubernetes.io/projected/41dd0be2-82c4-4469-b8d9-d1b98a4adb55-kube-api-access-f7dsv\") pod \"network-check-target-md7k7\" (UID: \"41dd0be2-82c4-4469-b8d9-d1b98a4adb55\") " pod="openshift-network-diagnostics/network-check-target-md7k7" Apr 16 16:48:01.161785 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:01.161710 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 16:48:01.161785 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:01.161732 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 16:48:01.161785 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:01.161745 2573 projected.go:194] Error preparing data for projected volume kube-api-access-f7dsv for pod openshift-network-diagnostics/network-check-target-md7k7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:48:01.162144 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:01.161813 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/41dd0be2-82c4-4469-b8d9-d1b98a4adb55-kube-api-access-f7dsv podName:41dd0be2-82c4-4469-b8d9-d1b98a4adb55 nodeName:}" failed. 
No retries permitted until 2026-04-16 16:48:05.161792383 +0000 UTC m=+10.302511125 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-f7dsv" (UniqueName: "kubernetes.io/projected/41dd0be2-82c4-4469-b8d9-d1b98a4adb55-kube-api-access-f7dsv") pod "network-check-target-md7k7" (UID: "41dd0be2-82c4-4469-b8d9-d1b98a4adb55") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:48:01.417242 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:01.417162 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-md7k7" Apr 16 16:48:01.417412 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:01.417169 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x6gbd" Apr 16 16:48:01.417412 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:01.417318 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-md7k7" podUID="41dd0be2-82c4-4469-b8d9-d1b98a4adb55" Apr 16 16:48:01.417412 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:01.417373 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-x6gbd" podUID="f7e26d85-638f-42c1-9b32-67320a5cbbe3" Apr 16 16:48:03.424652 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:03.424626 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-md7k7" Apr 16 16:48:03.425704 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:03.425200 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-md7k7" podUID="41dd0be2-82c4-4469-b8d9-d1b98a4adb55" Apr 16 16:48:03.425704 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:03.424694 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x6gbd" Apr 16 16:48:03.425704 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:03.425671 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x6gbd" podUID="f7e26d85-638f-42c1-9b32-67320a5cbbe3" Apr 16 16:48:03.784867 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:03.784433 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-4zxj2"] Apr 16 16:48:03.787293 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:03.787269 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-4zxj2" Apr 16 16:48:03.787440 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:03.787351 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-4zxj2" podUID="59cc8b51-5a0b-45ea-8e53-f1473e78b939" Apr 16 16:48:03.884084 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:03.883947 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/59cc8b51-5a0b-45ea-8e53-f1473e78b939-kubelet-config\") pod \"global-pull-secret-syncer-4zxj2\" (UID: \"59cc8b51-5a0b-45ea-8e53-f1473e78b939\") " pod="kube-system/global-pull-secret-syncer-4zxj2" Apr 16 16:48:03.884084 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:03.884012 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/59cc8b51-5a0b-45ea-8e53-f1473e78b939-original-pull-secret\") pod \"global-pull-secret-syncer-4zxj2\" (UID: \"59cc8b51-5a0b-45ea-8e53-f1473e78b939\") " pod="kube-system/global-pull-secret-syncer-4zxj2" Apr 16 16:48:03.884084 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:03.884044 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/59cc8b51-5a0b-45ea-8e53-f1473e78b939-dbus\") pod \"global-pull-secret-syncer-4zxj2\" (UID: \"59cc8b51-5a0b-45ea-8e53-f1473e78b939\") " pod="kube-system/global-pull-secret-syncer-4zxj2" Apr 16 16:48:03.984993 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:03.984840 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/59cc8b51-5a0b-45ea-8e53-f1473e78b939-kubelet-config\") pod \"global-pull-secret-syncer-4zxj2\" (UID: \"59cc8b51-5a0b-45ea-8e53-f1473e78b939\") " pod="kube-system/global-pull-secret-syncer-4zxj2" Apr 16 16:48:03.984993 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:03.984900 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/59cc8b51-5a0b-45ea-8e53-f1473e78b939-original-pull-secret\") pod \"global-pull-secret-syncer-4zxj2\" (UID: \"59cc8b51-5a0b-45ea-8e53-f1473e78b939\") " pod="kube-system/global-pull-secret-syncer-4zxj2" Apr 16 16:48:03.984993 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:03.984934 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/59cc8b51-5a0b-45ea-8e53-f1473e78b939-dbus\") pod \"global-pull-secret-syncer-4zxj2\" (UID: \"59cc8b51-5a0b-45ea-8e53-f1473e78b939\") " pod="kube-system/global-pull-secret-syncer-4zxj2" Apr 16 16:48:03.984993 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:03.984934 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/59cc8b51-5a0b-45ea-8e53-f1473e78b939-kubelet-config\") pod \"global-pull-secret-syncer-4zxj2\" (UID: \"59cc8b51-5a0b-45ea-8e53-f1473e78b939\") " pod="kube-system/global-pull-secret-syncer-4zxj2" Apr 16 16:48:03.984993 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:03.984986 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/59cc8b51-5a0b-45ea-8e53-f1473e78b939-dbus\") pod \"global-pull-secret-syncer-4zxj2\" (UID: \"59cc8b51-5a0b-45ea-8e53-f1473e78b939\") " pod="kube-system/global-pull-secret-syncer-4zxj2" Apr 16 16:48:03.985363 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:03.985029 2573 secret.go:189] Couldn't get secret 
kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 16:48:03.985363 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:03.985103 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/59cc8b51-5a0b-45ea-8e53-f1473e78b939-original-pull-secret podName:59cc8b51-5a0b-45ea-8e53-f1473e78b939 nodeName:}" failed. No retries permitted until 2026-04-16 16:48:04.485079854 +0000 UTC m=+9.625798587 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/59cc8b51-5a0b-45ea-8e53-f1473e78b939-original-pull-secret") pod "global-pull-secret-syncer-4zxj2" (UID: "59cc8b51-5a0b-45ea-8e53-f1473e78b939") : object "kube-system"/"original-pull-secret" not registered Apr 16 16:48:04.490175 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:04.489557 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/59cc8b51-5a0b-45ea-8e53-f1473e78b939-original-pull-secret\") pod \"global-pull-secret-syncer-4zxj2\" (UID: \"59cc8b51-5a0b-45ea-8e53-f1473e78b939\") " pod="kube-system/global-pull-secret-syncer-4zxj2" Apr 16 16:48:04.490175 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:04.489737 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 16:48:04.490175 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:04.489806 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/59cc8b51-5a0b-45ea-8e53-f1473e78b939-original-pull-secret podName:59cc8b51-5a0b-45ea-8e53-f1473e78b939 nodeName:}" failed. No retries permitted until 2026-04-16 16:48:05.489784437 +0000 UTC m=+10.630503160 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/59cc8b51-5a0b-45ea-8e53-f1473e78b939-original-pull-secret") pod "global-pull-secret-syncer-4zxj2" (UID: "59cc8b51-5a0b-45ea-8e53-f1473e78b939") : object "kube-system"/"original-pull-secret" not registered Apr 16 16:48:04.993554 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:04.993078 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f7e26d85-638f-42c1-9b32-67320a5cbbe3-metrics-certs\") pod \"network-metrics-daemon-x6gbd\" (UID: \"f7e26d85-638f-42c1-9b32-67320a5cbbe3\") " pod="openshift-multus/network-metrics-daemon-x6gbd" Apr 16 16:48:04.993554 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:04.993259 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:48:04.993554 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:04.993320 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f7e26d85-638f-42c1-9b32-67320a5cbbe3-metrics-certs podName:f7e26d85-638f-42c1-9b32-67320a5cbbe3 nodeName:}" failed. No retries permitted until 2026-04-16 16:48:12.993302655 +0000 UTC m=+18.134021387 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f7e26d85-638f-42c1-9b32-67320a5cbbe3-metrics-certs") pod "network-metrics-daemon-x6gbd" (UID: "f7e26d85-638f-42c1-9b32-67320a5cbbe3") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:48:05.194553 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:05.194518 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f7dsv\" (UniqueName: \"kubernetes.io/projected/41dd0be2-82c4-4469-b8d9-d1b98a4adb55-kube-api-access-f7dsv\") pod \"network-check-target-md7k7\" (UID: \"41dd0be2-82c4-4469-b8d9-d1b98a4adb55\") " pod="openshift-network-diagnostics/network-check-target-md7k7" Apr 16 16:48:05.194814 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:05.194700 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 16:48:05.194814 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:05.194723 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 16:48:05.194814 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:05.194736 2573 projected.go:194] Error preparing data for projected volume kube-api-access-f7dsv for pod openshift-network-diagnostics/network-check-target-md7k7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:48:05.194814 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:05.194795 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/41dd0be2-82c4-4469-b8d9-d1b98a4adb55-kube-api-access-f7dsv podName:41dd0be2-82c4-4469-b8d9-d1b98a4adb55 nodeName:}" failed. 
No retries permitted until 2026-04-16 16:48:13.194777617 +0000 UTC m=+18.335496340 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-f7dsv" (UniqueName: "kubernetes.io/projected/41dd0be2-82c4-4469-b8d9-d1b98a4adb55-kube-api-access-f7dsv") pod "network-check-target-md7k7" (UID: "41dd0be2-82c4-4469-b8d9-d1b98a4adb55") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:48:05.419869 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:05.419432 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x6gbd" Apr 16 16:48:05.419869 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:05.419754 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-4zxj2" Apr 16 16:48:05.419869 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:05.419755 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x6gbd" podUID="f7e26d85-638f-42c1-9b32-67320a5cbbe3" Apr 16 16:48:05.419869 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:05.419829 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-4zxj2" podUID="59cc8b51-5a0b-45ea-8e53-f1473e78b939" Apr 16 16:48:05.420137 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:05.419878 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-md7k7" Apr 16 16:48:05.420137 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:05.419977 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-md7k7" podUID="41dd0be2-82c4-4469-b8d9-d1b98a4adb55" Apr 16 16:48:05.497293 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:05.497264 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/59cc8b51-5a0b-45ea-8e53-f1473e78b939-original-pull-secret\") pod \"global-pull-secret-syncer-4zxj2\" (UID: \"59cc8b51-5a0b-45ea-8e53-f1473e78b939\") " pod="kube-system/global-pull-secret-syncer-4zxj2" Apr 16 16:48:05.497714 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:05.497402 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 16:48:05.497714 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:05.497470 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/59cc8b51-5a0b-45ea-8e53-f1473e78b939-original-pull-secret podName:59cc8b51-5a0b-45ea-8e53-f1473e78b939 nodeName:}" failed. No retries permitted until 2026-04-16 16:48:07.497450742 +0000 UTC m=+12.638169479 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/59cc8b51-5a0b-45ea-8e53-f1473e78b939-original-pull-secret") pod "global-pull-secret-syncer-4zxj2" (UID: "59cc8b51-5a0b-45ea-8e53-f1473e78b939") : object "kube-system"/"original-pull-secret" not registered Apr 16 16:48:07.416667 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:07.416532 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-md7k7" Apr 16 16:48:07.416667 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:07.416532 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-4zxj2" Apr 16 16:48:07.417133 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:07.416532 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x6gbd" Apr 16 16:48:07.417133 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:07.416741 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-4zxj2" podUID="59cc8b51-5a0b-45ea-8e53-f1473e78b939" Apr 16 16:48:07.417133 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:07.416650 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-md7k7" podUID="41dd0be2-82c4-4469-b8d9-d1b98a4adb55" Apr 16 16:48:07.417133 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:07.416848 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x6gbd" podUID="f7e26d85-638f-42c1-9b32-67320a5cbbe3" Apr 16 16:48:07.513487 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:07.513460 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/59cc8b51-5a0b-45ea-8e53-f1473e78b939-original-pull-secret\") pod \"global-pull-secret-syncer-4zxj2\" (UID: \"59cc8b51-5a0b-45ea-8e53-f1473e78b939\") " pod="kube-system/global-pull-secret-syncer-4zxj2" Apr 16 16:48:07.513625 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:07.513572 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 16:48:07.513625 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:07.513624 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/59cc8b51-5a0b-45ea-8e53-f1473e78b939-original-pull-secret podName:59cc8b51-5a0b-45ea-8e53-f1473e78b939 nodeName:}" failed. No retries permitted until 2026-04-16 16:48:11.513609671 +0000 UTC m=+16.654328394 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/59cc8b51-5a0b-45ea-8e53-f1473e78b939-original-pull-secret") pod "global-pull-secret-syncer-4zxj2" (UID: "59cc8b51-5a0b-45ea-8e53-f1473e78b939") : object "kube-system"/"original-pull-secret" not registered
Apr 16 16:48:09.416878 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:09.416847 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x6gbd"
Apr 16 16:48:09.417313 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:09.416847 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-4zxj2"
Apr 16 16:48:09.417313 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:09.416977 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x6gbd" podUID="f7e26d85-638f-42c1-9b32-67320a5cbbe3"
Apr 16 16:48:09.417313 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:09.417070 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-4zxj2" podUID="59cc8b51-5a0b-45ea-8e53-f1473e78b939"
Apr 16 16:48:09.417313 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:09.416848 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-md7k7"
Apr 16 16:48:09.417313 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:09.417154 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-md7k7" podUID="41dd0be2-82c4-4469-b8d9-d1b98a4adb55"
Apr 16 16:48:11.416877 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:11.416828 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-md7k7"
Apr 16 16:48:11.417315 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:11.416828 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-4zxj2"
Apr 16 16:48:11.417315 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:11.416958 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x6gbd"
Apr 16 16:48:11.417315 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:11.416955 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-md7k7" podUID="41dd0be2-82c4-4469-b8d9-d1b98a4adb55"
Apr 16 16:48:11.417315 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:11.417051 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-4zxj2" podUID="59cc8b51-5a0b-45ea-8e53-f1473e78b939"
Apr 16 16:48:11.417315 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:11.417148 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x6gbd" podUID="f7e26d85-638f-42c1-9b32-67320a5cbbe3"
Apr 16 16:48:11.545028 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:11.544995 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/59cc8b51-5a0b-45ea-8e53-f1473e78b939-original-pull-secret\") pod \"global-pull-secret-syncer-4zxj2\" (UID: \"59cc8b51-5a0b-45ea-8e53-f1473e78b939\") " pod="kube-system/global-pull-secret-syncer-4zxj2"
Apr 16 16:48:11.545185 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:11.545119 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 16:48:11.545185 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:11.545176 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/59cc8b51-5a0b-45ea-8e53-f1473e78b939-original-pull-secret podName:59cc8b51-5a0b-45ea-8e53-f1473e78b939 nodeName:}" failed. No retries permitted until 2026-04-16 16:48:19.545158255 +0000 UTC m=+24.685876975 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/59cc8b51-5a0b-45ea-8e53-f1473e78b939-original-pull-secret") pod "global-pull-secret-syncer-4zxj2" (UID: "59cc8b51-5a0b-45ea-8e53-f1473e78b939") : object "kube-system"/"original-pull-secret" not registered
Apr 16 16:48:13.059213 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:13.059181 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f7e26d85-638f-42c1-9b32-67320a5cbbe3-metrics-certs\") pod \"network-metrics-daemon-x6gbd\" (UID: \"f7e26d85-638f-42c1-9b32-67320a5cbbe3\") " pod="openshift-multus/network-metrics-daemon-x6gbd"
Apr 16 16:48:13.059706 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:13.059316 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 16:48:13.059706 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:13.059388 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f7e26d85-638f-42c1-9b32-67320a5cbbe3-metrics-certs podName:f7e26d85-638f-42c1-9b32-67320a5cbbe3 nodeName:}" failed. No retries permitted until 2026-04-16 16:48:29.059361305 +0000 UTC m=+34.200080025 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f7e26d85-638f-42c1-9b32-67320a5cbbe3-metrics-certs") pod "network-metrics-daemon-x6gbd" (UID: "f7e26d85-638f-42c1-9b32-67320a5cbbe3") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 16:48:13.260453 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:13.260423 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f7dsv\" (UniqueName: \"kubernetes.io/projected/41dd0be2-82c4-4469-b8d9-d1b98a4adb55-kube-api-access-f7dsv\") pod \"network-check-target-md7k7\" (UID: \"41dd0be2-82c4-4469-b8d9-d1b98a4adb55\") " pod="openshift-network-diagnostics/network-check-target-md7k7"
Apr 16 16:48:13.260614 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:13.260550 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 16:48:13.260614 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:13.260566 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 16:48:13.260614 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:13.260573 2573 projected.go:194] Error preparing data for projected volume kube-api-access-f7dsv for pod openshift-network-diagnostics/network-check-target-md7k7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 16:48:13.260739 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:13.260617 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/41dd0be2-82c4-4469-b8d9-d1b98a4adb55-kube-api-access-f7dsv podName:41dd0be2-82c4-4469-b8d9-d1b98a4adb55 nodeName:}" failed. No retries permitted until 2026-04-16 16:48:29.260605105 +0000 UTC m=+34.401323826 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-f7dsv" (UniqueName: "kubernetes.io/projected/41dd0be2-82c4-4469-b8d9-d1b98a4adb55-kube-api-access-f7dsv") pod "network-check-target-md7k7" (UID: "41dd0be2-82c4-4469-b8d9-d1b98a4adb55") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 16:48:13.416446 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:13.416358 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-md7k7"
Apr 16 16:48:13.416596 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:13.416361 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-4zxj2"
Apr 16 16:48:13.416596 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:13.416488 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-md7k7" podUID="41dd0be2-82c4-4469-b8d9-d1b98a4adb55"
Apr 16 16:48:13.416596 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:13.416559 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-4zxj2" podUID="59cc8b51-5a0b-45ea-8e53-f1473e78b939"
Apr 16 16:48:13.416596 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:13.416361 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x6gbd"
Apr 16 16:48:13.416804 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:13.416699 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x6gbd" podUID="f7e26d85-638f-42c1-9b32-67320a5cbbe3"
Apr 16 16:48:15.418459 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:15.418434 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x6gbd"
Apr 16 16:48:15.419119 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:15.418557 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x6gbd" podUID="f7e26d85-638f-42c1-9b32-67320a5cbbe3"
Apr 16 16:48:15.419119 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:15.418627 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-4zxj2"
Apr 16 16:48:15.419119 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:15.418674 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-4zxj2" podUID="59cc8b51-5a0b-45ea-8e53-f1473e78b939"
Apr 16 16:48:15.419545 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:15.419495 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-md7k7"
Apr 16 16:48:15.419603 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:15.419577 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-md7k7" podUID="41dd0be2-82c4-4469-b8d9-d1b98a4adb55"
Apr 16 16:48:15.567975 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:15.567813 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kjtps" event={"ID":"a3e5fc61-deb6-4697-bcc8-92eee1a15876","Type":"ContainerStarted","Data":"55579c61659a9a9788a5896b5a118512f84cc0872767c7e2ed5e04e86e79f35f"}
Apr 16 16:48:15.569011 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:15.568986 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-ms2xr" event={"ID":"1950f42e-3894-4436-a1c6-d5e65379ba61","Type":"ContainerStarted","Data":"a191187642332d5948c46ced27ec56dcc4aa552761794e5a4ab5a5265a414cac"}
Apr 16 16:48:15.570136 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:15.570116 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-tg8h2" event={"ID":"e050081a-408e-4bfd-b874-def2bfd7f635","Type":"ContainerStarted","Data":"a28e5d8be5e32e3239cd26886676277cbcb3255ee7f7575c5cdb6cf0e1de1a28"}
Apr 16 16:48:15.571232 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:15.571208 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-c8bgf" event={"ID":"38487532-b4be-41ef-a345-cea3cc5a643c","Type":"ContainerStarted","Data":"0d32f2dfc6d56e615d947fd73bbe4764916a5d3c9806186fb208cceea34fa4fa"}
Apr 16 16:48:15.572335 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:15.572313 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-77mp4" event={"ID":"496bd28d-40d9-43b2-91c6-462df146eecc","Type":"ContainerStarted","Data":"ec1dddf5984a732a54a778359049e6debbd59ad070ea12d5750117f908eed4c5"}
Apr 16 16:48:15.573496 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:15.573473 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-t7jrv" event={"ID":"50367a6c-7164-45c2-b2f1-af3375aa5768","Type":"ContainerStarted","Data":"308b878c122c8e9920e4de1121a93466a02f6470759e33d767abc058bd5d3b01"}
Apr 16 16:48:15.574607 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:15.574590 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-49l6w" event={"ID":"a3cab79a-4ea1-4744-b205-fd85c929391f","Type":"ContainerStarted","Data":"1ce3ae11931c62dcb87399aa3c21f6101258ab7241c7682e428d36f20cfb519c"}
Apr 16 16:48:15.586247 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:15.586213 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-ms2xr" podStartSLOduration=3.400529588 podStartE2EDuration="20.586203142s" podCreationTimestamp="2026-04-16 16:47:55 +0000 UTC" firstStartedPulling="2026-04-16 16:47:57.971103692 +0000 UTC m=+3.111822411" lastFinishedPulling="2026-04-16 16:48:15.156777237 +0000 UTC m=+20.297495965" observedRunningTime="2026-04-16 16:48:15.585826875 +0000 UTC m=+20.726545629" watchObservedRunningTime="2026-04-16 16:48:15.586203142 +0000 UTC m=+20.726921884"
Apr 16 16:48:15.640293 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:15.640237 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-tg8h2" podStartSLOduration=3.446774342 podStartE2EDuration="20.640220644s" podCreationTimestamp="2026-04-16 16:47:55 +0000 UTC" firstStartedPulling="2026-04-16 16:47:57.96773299 +0000 UTC m=+3.108451710" lastFinishedPulling="2026-04-16 16:48:15.161179286 +0000 UTC m=+20.301898012" observedRunningTime="2026-04-16 16:48:15.627245626 +0000 UTC m=+20.767964371" watchObservedRunningTime="2026-04-16 16:48:15.640220644 +0000 UTC m=+20.780939386"
Apr 16 16:48:15.643418 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:15.643363 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-c8bgf" podStartSLOduration=3.46111411 podStartE2EDuration="20.643351561s" podCreationTimestamp="2026-04-16 16:47:55 +0000 UTC" firstStartedPulling="2026-04-16 16:47:57.974715577 +0000 UTC m=+3.115434305" lastFinishedPulling="2026-04-16 16:48:15.156953035 +0000 UTC m=+20.297671756" observedRunningTime="2026-04-16 16:48:15.643013894 +0000 UTC m=+20.783732636" watchObservedRunningTime="2026-04-16 16:48:15.643351561 +0000 UTC m=+20.784070302"
Apr 16 16:48:15.699578 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:15.699519 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-49l6w" podStartSLOduration=8.215254086 podStartE2EDuration="20.699501218s" podCreationTimestamp="2026-04-16 16:47:55 +0000 UTC" firstStartedPulling="2026-04-16 16:47:57.967805812 +0000 UTC m=+3.108524550" lastFinishedPulling="2026-04-16 16:48:10.452052941 +0000 UTC m=+15.592771682" observedRunningTime="2026-04-16 16:48:15.666058347 +0000 UTC m=+20.806777089" watchObservedRunningTime="2026-04-16 16:48:15.699501218 +0000 UTC m=+20.840219960"
Apr 16 16:48:15.699716 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:15.699686 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-77mp4" podStartSLOduration=3.334179877 podStartE2EDuration="20.699675671s" podCreationTimestamp="2026-04-16 16:47:55 +0000 UTC" firstStartedPulling="2026-04-16 16:47:57.976993929 +0000 UTC m=+3.117712653" lastFinishedPulling="2026-04-16 16:48:15.342489725 +0000 UTC m=+20.483208447" observedRunningTime="2026-04-16 16:48:15.698811318 +0000 UTC m=+20.839530065" watchObservedRunningTime="2026-04-16 16:48:15.699675671 +0000 UTC m=+20.840394414"
Apr 16 16:48:16.291237 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:16.291038 2573 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 16 16:48:16.389900 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:16.389801 2573 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T16:48:16.291234363Z","UUID":"07dafb92-20af-485d-8f49-11875616ba71","Handler":null,"Name":"","Endpoint":""}
Apr 16 16:48:16.391205 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:16.391184 2573 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 16 16:48:16.391299 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:16.391210 2573 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 16 16:48:16.577501 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:16.577473 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kjtps" event={"ID":"a3e5fc61-deb6-4697-bcc8-92eee1a15876","Type":"ContainerStarted","Data":"90b3748cd7bc97f953fefaead31845ee4a6554989f3d75563fa99f0ab5ec5108"}
Apr 16 16:48:16.578701 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:16.578680 2573 generic.go:358] "Generic (PLEG): container finished" podID="50367a6c-7164-45c2-b2f1-af3375aa5768" containerID="308b878c122c8e9920e4de1121a93466a02f6470759e33d767abc058bd5d3b01" exitCode=0
Apr 16 16:48:16.578802 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:16.578747 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-t7jrv" event={"ID":"50367a6c-7164-45c2-b2f1-af3375aa5768","Type":"ContainerDied","Data":"308b878c122c8e9920e4de1121a93466a02f6470759e33d767abc058bd5d3b01"}
Apr 16 16:48:16.581478 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:16.581452 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vts2x" event={"ID":"cc2084a8-5ef5-4dde-bae7-f84589b59b40","Type":"ContainerStarted","Data":"b304a44cfaaed8765d22b8a83224979d383be1f3296de3a1c92853a61b7015a9"}
Apr 16 16:48:16.581574 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:16.581488 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vts2x" event={"ID":"cc2084a8-5ef5-4dde-bae7-f84589b59b40","Type":"ContainerStarted","Data":"5a3bc130f53068b6ec0557e53d0c47ea25812b213933ab58f3aac2cacdd1305f"}
Apr 16 16:48:16.581574 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:16.581504 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vts2x" event={"ID":"cc2084a8-5ef5-4dde-bae7-f84589b59b40","Type":"ContainerStarted","Data":"fc3411ed65c3ad878250d43f308ade5bc2bb15291806d3ead76ede4a4452a8ae"}
Apr 16 16:48:16.581574 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:16.581516 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vts2x" event={"ID":"cc2084a8-5ef5-4dde-bae7-f84589b59b40","Type":"ContainerStarted","Data":"5856e48006b1082e6a9915138be7596deba39403bf464e53b5dd0b02cc0219c6"}
Apr 16 16:48:16.581574 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:16.581527 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vts2x" event={"ID":"cc2084a8-5ef5-4dde-bae7-f84589b59b40","Type":"ContainerStarted","Data":"e616ef2a27fd2704f3676d793f56896d152df2dc99a37e72922187a5ffe73f06"}
Apr 16 16:48:16.581574 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:16.581539 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vts2x" event={"ID":"cc2084a8-5ef5-4dde-bae7-f84589b59b40","Type":"ContainerStarted","Data":"6cc29a4d78960554bb4c0d07fc2104859e00c4a15c8fbfc37d1f2decf2c2e395"}
Apr 16 16:48:17.417219 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:17.417138 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-4zxj2"
Apr 16 16:48:17.417500 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:17.417237 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-4zxj2" podUID="59cc8b51-5a0b-45ea-8e53-f1473e78b939"
Apr 16 16:48:17.417500 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:17.417247 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-md7k7"
Apr 16 16:48:17.417500 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:17.417278 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x6gbd"
Apr 16 16:48:17.417500 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:17.417316 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-md7k7" podUID="41dd0be2-82c4-4469-b8d9-d1b98a4adb55"
Apr 16 16:48:17.417500 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:17.417460 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x6gbd" podUID="f7e26d85-638f-42c1-9b32-67320a5cbbe3"
Apr 16 16:48:17.584210 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:17.584177 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-8ghz4" event={"ID":"06481c75-0539-4928-baa6-a9ed683f7054","Type":"ContainerStarted","Data":"709c6e2eea53de8ff88d797d14e33a470c14ba8cc486fe9fe9bdfb0d443c9ee4"}
Apr 16 16:48:17.586185 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:17.586157 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kjtps" event={"ID":"a3e5fc61-deb6-4697-bcc8-92eee1a15876","Type":"ContainerStarted","Data":"6475c89ca5095d0ef9484f3e2d83303c26dbd5405cff84b2a7301140bd964b9d"}
Apr 16 16:48:17.602759 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:17.602722 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-8ghz4" podStartSLOduration=5.424856792 podStartE2EDuration="22.602710338s" podCreationTimestamp="2026-04-16 16:47:55 +0000 UTC" firstStartedPulling="2026-04-16 16:47:57.978900895 +0000 UTC m=+3.119619615" lastFinishedPulling="2026-04-16 16:48:15.156754438 +0000 UTC m=+20.297473161" observedRunningTime="2026-04-16 16:48:17.602580449 +0000 UTC m=+22.743299191" watchObservedRunningTime="2026-04-16 16:48:17.602710338 +0000 UTC m=+22.743429079"
Apr 16 16:48:17.619827 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:17.619626 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kjtps" podStartSLOduration=3.5468605650000002 podStartE2EDuration="22.619608463s" podCreationTimestamp="2026-04-16 16:47:55 +0000 UTC" firstStartedPulling="2026-04-16 16:47:57.973494837 +0000 UTC m=+3.114213562" lastFinishedPulling="2026-04-16 16:48:17.04624274 +0000 UTC m=+22.186961460" observedRunningTime="2026-04-16 16:48:17.619131998 +0000 UTC m=+22.759850740" watchObservedRunningTime="2026-04-16 16:48:17.619608463 +0000 UTC m=+22.760327207"
Apr 16 16:48:18.591282 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:18.591184 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vts2x" event={"ID":"cc2084a8-5ef5-4dde-bae7-f84589b59b40","Type":"ContainerStarted","Data":"048a27c05cb4b734d64e56b96046adeb1ee35ee68c1dffce01d67341b8d6768c"}
Apr 16 16:48:19.416914 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:19.416870 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x6gbd"
Apr 16 16:48:19.417083 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:19.416870 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-md7k7"
Apr 16 16:48:19.417083 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:19.416991 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x6gbd" podUID="f7e26d85-638f-42c1-9b32-67320a5cbbe3"
Apr 16 16:48:19.417187 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:19.417077 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-md7k7" podUID="41dd0be2-82c4-4469-b8d9-d1b98a4adb55"
Apr 16 16:48:19.417187 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:19.416870 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-4zxj2"
Apr 16 16:48:19.417263 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:19.417185 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-4zxj2" podUID="59cc8b51-5a0b-45ea-8e53-f1473e78b939"
Apr 16 16:48:19.606980 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:19.606949 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/59cc8b51-5a0b-45ea-8e53-f1473e78b939-original-pull-secret\") pod \"global-pull-secret-syncer-4zxj2\" (UID: \"59cc8b51-5a0b-45ea-8e53-f1473e78b939\") " pod="kube-system/global-pull-secret-syncer-4zxj2"
Apr 16 16:48:19.607465 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:19.607098 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 16:48:19.607465 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:19.607168 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/59cc8b51-5a0b-45ea-8e53-f1473e78b939-original-pull-secret podName:59cc8b51-5a0b-45ea-8e53-f1473e78b939 nodeName:}" failed. No retries permitted until 2026-04-16 16:48:35.607148739 +0000 UTC m=+40.747867461 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/59cc8b51-5a0b-45ea-8e53-f1473e78b939-original-pull-secret") pod "global-pull-secret-syncer-4zxj2" (UID: "59cc8b51-5a0b-45ea-8e53-f1473e78b939") : object "kube-system"/"original-pull-secret" not registered
Apr 16 16:48:20.430933 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:20.430900 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-ms2xr"
Apr 16 16:48:20.431565 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:20.431540 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-ms2xr"
Apr 16 16:48:21.416819 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:21.416616 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-md7k7"
Apr 16 16:48:21.417336 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:21.416616 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x6gbd"
Apr 16 16:48:21.417336 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:21.416888 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-md7k7" podUID="41dd0be2-82c4-4469-b8d9-d1b98a4adb55"
Apr 16 16:48:21.417336 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:21.416628 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-4zxj2"
Apr 16 16:48:21.417336 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:21.417040 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x6gbd" podUID="f7e26d85-638f-42c1-9b32-67320a5cbbe3"
Apr 16 16:48:21.417336 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:21.417060 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-4zxj2" podUID="59cc8b51-5a0b-45ea-8e53-f1473e78b939"
Apr 16 16:48:21.597551 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:21.597519 2573 generic.go:358] "Generic (PLEG): container finished" podID="50367a6c-7164-45c2-b2f1-af3375aa5768" containerID="fa4ec732c30c46b41edc10fc3d22033c199d9f475e115948692eb7101b298759" exitCode=0
Apr 16 16:48:21.597679 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:21.597576 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-t7jrv" event={"ID":"50367a6c-7164-45c2-b2f1-af3375aa5768","Type":"ContainerDied","Data":"fa4ec732c30c46b41edc10fc3d22033c199d9f475e115948692eb7101b298759"}
Apr 16 16:48:21.601724 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:21.601697 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vts2x" event={"ID":"cc2084a8-5ef5-4dde-bae7-f84589b59b40","Type":"ContainerStarted","Data":"800523edd8ab595e6916fcb88bc92d74c2a74b57d39a0aa2b67eab4aefd1153c"}
Apr 16 16:48:21.602118 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:21.602088 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-vts2x"
Apr 16 16:48:21.618184 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:21.618160 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vts2x"
Apr 16 16:48:21.654228 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:21.654183 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-vts2x" podStartSLOduration=8.910770887 podStartE2EDuration="26.654170541s" podCreationTimestamp="2026-04-16 16:47:55 +0000 UTC" firstStartedPulling="2026-04-16 16:47:57.964547221 +0000 UTC m=+3.105265958" lastFinishedPulling="2026-04-16 16:48:15.707946892 +0000 UTC m=+20.848665612" observedRunningTime="2026-04-16 16:48:21.653178393 +0000 UTC m=+26.793897134" watchObservedRunningTime="2026-04-16 16:48:21.654170541 +0000 UTC m=+26.794889344"
Apr 16 16:48:22.548404 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:22.546845 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-4zxj2"]
Apr 16 16:48:22.548404 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:22.547185 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-4zxj2"
Apr 16 16:48:22.548404 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:22.547315 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-4zxj2" podUID="59cc8b51-5a0b-45ea-8e53-f1473e78b939"
Apr 16 16:48:22.550332 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:22.550304 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-x6gbd"]
Apr 16 16:48:22.550484 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:22.550453 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x6gbd"
Apr 16 16:48:22.550689 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:22.550652 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x6gbd" podUID="f7e26d85-638f-42c1-9b32-67320a5cbbe3"
Apr 16 16:48:22.550922 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:22.550895 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-md7k7"]
Apr 16 16:48:22.551007 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:22.550986 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-md7k7"
Apr 16 16:48:22.551096 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:22.551069 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-md7k7" podUID="41dd0be2-82c4-4469-b8d9-d1b98a4adb55"
Apr 16 16:48:22.604816 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:22.604795 2573 generic.go:358] "Generic (PLEG): container finished" podID="50367a6c-7164-45c2-b2f1-af3375aa5768" containerID="9638e71d0dec8d92f0fa0e7ab69eab408a13b0f7dadd58cf27e32a730a1d514e" exitCode=0
Apr 16 16:48:22.604900 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:22.604882 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-t7jrv" event={"ID":"50367a6c-7164-45c2-b2f1-af3375aa5768","Type":"ContainerDied","Data":"9638e71d0dec8d92f0fa0e7ab69eab408a13b0f7dadd58cf27e32a730a1d514e"}
Apr 16 16:48:22.605076 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:22.605065 2573 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 16 16:48:22.605986 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:22.605416 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-vts2x"
Apr 16 16:48:22.619594 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:22.619578 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vts2x"
Apr 16 16:48:23.509828 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:23.509760 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-vts2x"
Apr 16 16:48:23.608812 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:23.608781 2573 generic.go:358] "Generic (PLEG): container finished" podID="50367a6c-7164-45c2-b2f1-af3375aa5768" containerID="aa5d63c13947ecc8fee385f414ce74eecc01feed4f84d84bcf45e62c7d3c85dd" exitCode=0
Apr 16 16:48:23.609164 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:23.608858 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-t7jrv"
event={"ID":"50367a6c-7164-45c2-b2f1-af3375aa5768","Type":"ContainerDied","Data":"aa5d63c13947ecc8fee385f414ce74eecc01feed4f84d84bcf45e62c7d3c85dd"} Apr 16 16:48:24.417292 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:24.417262 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-md7k7" Apr 16 16:48:24.417292 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:24.417284 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x6gbd" Apr 16 16:48:24.417538 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:24.417262 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-4zxj2" Apr 16 16:48:24.417538 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:24.417363 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-md7k7" podUID="41dd0be2-82c4-4469-b8d9-d1b98a4adb55" Apr 16 16:48:24.417538 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:24.417478 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-x6gbd" podUID="f7e26d85-638f-42c1-9b32-67320a5cbbe3" Apr 16 16:48:24.417684 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:24.417557 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-4zxj2" podUID="59cc8b51-5a0b-45ea-8e53-f1473e78b939" Apr 16 16:48:24.503846 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:24.503815 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-ms2xr" Apr 16 16:48:24.504010 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:24.503945 2573 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 16 16:48:24.504863 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:24.504835 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-ms2xr" Apr 16 16:48:26.417399 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:26.416855 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-md7k7" Apr 16 16:48:26.417399 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:26.416902 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-4zxj2" Apr 16 16:48:26.417399 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:26.417001 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-x6gbd" Apr 16 16:48:26.417399 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:26.417009 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-4zxj2" podUID="59cc8b51-5a0b-45ea-8e53-f1473e78b939" Apr 16 16:48:26.417399 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:26.417124 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x6gbd" podUID="f7e26d85-638f-42c1-9b32-67320a5cbbe3" Apr 16 16:48:26.417399 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:26.417205 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-md7k7" podUID="41dd0be2-82c4-4469-b8d9-d1b98a4adb55" Apr 16 16:48:28.166440 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.166351 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-58.ec2.internal" event="NodeReady" Apr 16 16:48:28.166995 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.166530 2573 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 16 16:48:28.205569 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.205537 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-558df76499-sw5fx"] Apr 16 16:48:28.230265 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.230232 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8596fb7f8-nnmch"] Apr 16 16:48:28.230438 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.230416 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-558df76499-sw5fx" Apr 16 16:48:28.233443 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.233419 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-lm58j\"" Apr 16 16:48:28.233698 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.233672 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 16 16:48:28.233944 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.233914 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 16 16:48:28.234112 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.234044 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 16 16:48:28.240091 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.240069 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 16 16:48:28.244203 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.244083 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-8547679d7d-25lps"] Apr 16 16:48:28.245139 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.244441 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8596fb7f8-nnmch" Apr 16 16:48:28.247464 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.247445 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 16 16:48:28.247464 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.247458 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 16 16:48:28.247623 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.247490 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 16 16:48:28.247826 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.247810 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 16 16:48:28.247826 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.247822 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 16 16:48:28.248042 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.248027 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 16 16:48:28.248213 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.248198 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 16 16:48:28.260555 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.260531 2573 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-cdff685c-x7jtx"] Apr 16 16:48:28.260725 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.260709 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8547679d7d-25lps" Apr 16 16:48:28.264981 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.264960 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 16 16:48:28.278697 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.278667 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-cdff685c-x7jtx"] Apr 16 16:48:28.278697 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.278693 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-558df76499-sw5fx"] Apr 16 16:48:28.278843 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.278708 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-8547679d7d-25lps"] Apr 16 16:48:28.278843 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.278720 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8596fb7f8-nnmch"] Apr 16 16:48:28.278843 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.278734 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-lhgcr"] Apr 16 16:48:28.278843 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.278783 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-cdff685c-x7jtx" Apr 16 16:48:28.282594 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.282575 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 16 16:48:28.282696 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.282667 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-thjvj\"" Apr 16 16:48:28.297568 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.297546 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-zt96l"] Apr 16 16:48:28.297726 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.297710 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-lhgcr" Apr 16 16:48:28.300574 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.300537 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 16 16:48:28.300744 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.300721 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-fwjqf\"" Apr 16 16:48:28.300744 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.300737 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 16 16:48:28.312064 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.312045 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-lhgcr"] Apr 16 16:48:28.312150 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.312068 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-zt96l"] Apr 16 16:48:28.312210 ip-10-0-138-58 
kubenswrapper[2573]: I0416 16:48:28.312170 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-zt96l" Apr 16 16:48:28.314645 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.314625 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 16 16:48:28.314738 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.314692 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-gwwfm\"" Apr 16 16:48:28.314836 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.314816 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 16 16:48:28.314936 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.314925 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 16 16:48:28.374512 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.374486 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d030f4e9-69ab-40db-8fd6-d2f53d467cdb-ca-trust-extracted\") pod \"image-registry-558df76499-sw5fx\" (UID: \"d030f4e9-69ab-40db-8fd6-d2f53d467cdb\") " pod="openshift-image-registry/image-registry-558df76499-sw5fx" Apr 16 16:48:28.374627 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.374518 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d030f4e9-69ab-40db-8fd6-d2f53d467cdb-bound-sa-token\") pod \"image-registry-558df76499-sw5fx\" (UID: \"d030f4e9-69ab-40db-8fd6-d2f53d467cdb\") " pod="openshift-image-registry/image-registry-558df76499-sw5fx" Apr 16 16:48:28.374627 ip-10-0-138-58 kubenswrapper[2573]: 
I0416 16:48:28.374599 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/536018da-d22a-4d5a-a54d-4d2116c68151-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-8596fb7f8-nnmch\" (UID: \"536018da-d22a-4d5a-a54d-4d2116c68151\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8596fb7f8-nnmch" Apr 16 16:48:28.374722 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.374635 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5qr7\" (UniqueName: \"kubernetes.io/projected/36ad6397-4200-4581-8948-4e5ecae47a04-kube-api-access-d5qr7\") pod \"managed-serviceaccount-addon-agent-cdff685c-x7jtx\" (UID: \"36ad6397-4200-4581-8948-4e5ecae47a04\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-cdff685c-x7jtx" Apr 16 16:48:28.374722 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.374661 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5ca9f9f4-bcc1-439f-8f7f-3c133fadff5f-tmp\") pod \"klusterlet-addon-workmgr-8547679d7d-25lps\" (UID: \"5ca9f9f4-bcc1-439f-8f7f-3c133fadff5f\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8547679d7d-25lps" Apr 16 16:48:28.374722 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.374698 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/536018da-d22a-4d5a-a54d-4d2116c68151-hub\") pod \"cluster-proxy-proxy-agent-8596fb7f8-nnmch\" (UID: \"536018da-d22a-4d5a-a54d-4d2116c68151\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8596fb7f8-nnmch" Apr 16 16:48:28.374841 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.374749 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/536018da-d22a-4d5a-a54d-4d2116c68151-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-8596fb7f8-nnmch\" (UID: \"536018da-d22a-4d5a-a54d-4d2116c68151\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8596fb7f8-nnmch" Apr 16 16:48:28.374841 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.374798 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5g82\" (UniqueName: \"kubernetes.io/projected/d030f4e9-69ab-40db-8fd6-d2f53d467cdb-kube-api-access-x5g82\") pod \"image-registry-558df76499-sw5fx\" (UID: \"d030f4e9-69ab-40db-8fd6-d2f53d467cdb\") " pod="openshift-image-registry/image-registry-558df76499-sw5fx" Apr 16 16:48:28.374910 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.374840 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqmns\" (UniqueName: \"kubernetes.io/projected/536018da-d22a-4d5a-a54d-4d2116c68151-kube-api-access-mqmns\") pod \"cluster-proxy-proxy-agent-8596fb7f8-nnmch\" (UID: \"536018da-d22a-4d5a-a54d-4d2116c68151\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8596fb7f8-nnmch" Apr 16 16:48:28.374910 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.374875 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fp7gx\" (UniqueName: \"kubernetes.io/projected/5ca9f9f4-bcc1-439f-8f7f-3c133fadff5f-kube-api-access-fp7gx\") pod \"klusterlet-addon-workmgr-8547679d7d-25lps\" (UID: \"5ca9f9f4-bcc1-439f-8f7f-3c133fadff5f\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8547679d7d-25lps" Apr 16 16:48:28.374980 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.374934 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: 
\"kubernetes.io/secret/36ad6397-4200-4581-8948-4e5ecae47a04-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-cdff685c-x7jtx\" (UID: \"36ad6397-4200-4581-8948-4e5ecae47a04\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-cdff685c-x7jtx" Apr 16 16:48:28.374980 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.374960 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d030f4e9-69ab-40db-8fd6-d2f53d467cdb-registry-tls\") pod \"image-registry-558df76499-sw5fx\" (UID: \"d030f4e9-69ab-40db-8fd6-d2f53d467cdb\") " pod="openshift-image-registry/image-registry-558df76499-sw5fx" Apr 16 16:48:28.375061 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.374993 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d030f4e9-69ab-40db-8fd6-d2f53d467cdb-registry-certificates\") pod \"image-registry-558df76499-sw5fx\" (UID: \"d030f4e9-69ab-40db-8fd6-d2f53d467cdb\") " pod="openshift-image-registry/image-registry-558df76499-sw5fx" Apr 16 16:48:28.375061 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.375016 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d030f4e9-69ab-40db-8fd6-d2f53d467cdb-trusted-ca\") pod \"image-registry-558df76499-sw5fx\" (UID: \"d030f4e9-69ab-40db-8fd6-d2f53d467cdb\") " pod="openshift-image-registry/image-registry-558df76499-sw5fx" Apr 16 16:48:28.375138 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.375061 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d030f4e9-69ab-40db-8fd6-d2f53d467cdb-installation-pull-secrets\") pod \"image-registry-558df76499-sw5fx\" (UID: 
\"d030f4e9-69ab-40db-8fd6-d2f53d467cdb\") " pod="openshift-image-registry/image-registry-558df76499-sw5fx" Apr 16 16:48:28.375138 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.375096 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/536018da-d22a-4d5a-a54d-4d2116c68151-ca\") pod \"cluster-proxy-proxy-agent-8596fb7f8-nnmch\" (UID: \"536018da-d22a-4d5a-a54d-4d2116c68151\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8596fb7f8-nnmch" Apr 16 16:48:28.375138 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.375123 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/5ca9f9f4-bcc1-439f-8f7f-3c133fadff5f-klusterlet-config\") pod \"klusterlet-addon-workmgr-8547679d7d-25lps\" (UID: \"5ca9f9f4-bcc1-439f-8f7f-3c133fadff5f\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8547679d7d-25lps" Apr 16 16:48:28.375266 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.375178 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/536018da-d22a-4d5a-a54d-4d2116c68151-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-8596fb7f8-nnmch\" (UID: \"536018da-d22a-4d5a-a54d-4d2116c68151\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8596fb7f8-nnmch" Apr 16 16:48:28.375266 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.375214 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/d030f4e9-69ab-40db-8fd6-d2f53d467cdb-image-registry-private-configuration\") pod \"image-registry-558df76499-sw5fx\" (UID: \"d030f4e9-69ab-40db-8fd6-d2f53d467cdb\") " 
pod="openshift-image-registry/image-registry-558df76499-sw5fx" Apr 16 16:48:28.416869 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.416811 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x6gbd" Apr 16 16:48:28.416869 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.416826 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-4zxj2" Apr 16 16:48:28.417020 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.416811 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-md7k7" Apr 16 16:48:28.420058 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.420037 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 16:48:28.420177 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.420097 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 16:48:28.420177 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.420105 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 16:48:28.420280 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.420191 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-mkc4m\"" Apr 16 16:48:28.420280 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.420232 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-v7sx7\"" Apr 16 16:48:28.420358 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.420301 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 
16:48:28.476339 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.476316 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/536018da-d22a-4d5a-a54d-4d2116c68151-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-8596fb7f8-nnmch\" (UID: \"536018da-d22a-4d5a-a54d-4d2116c68151\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8596fb7f8-nnmch"
Apr 16 16:48:28.476478 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.476348 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/d030f4e9-69ab-40db-8fd6-d2f53d467cdb-image-registry-private-configuration\") pod \"image-registry-558df76499-sw5fx\" (UID: \"d030f4e9-69ab-40db-8fd6-d2f53d467cdb\") " pod="openshift-image-registry/image-registry-558df76499-sw5fx"
Apr 16 16:48:28.476478 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.476394 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d030f4e9-69ab-40db-8fd6-d2f53d467cdb-ca-trust-extracted\") pod \"image-registry-558df76499-sw5fx\" (UID: \"d030f4e9-69ab-40db-8fd6-d2f53d467cdb\") " pod="openshift-image-registry/image-registry-558df76499-sw5fx"
Apr 16 16:48:28.476478 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.476422 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d030f4e9-69ab-40db-8fd6-d2f53d467cdb-bound-sa-token\") pod \"image-registry-558df76499-sw5fx\" (UID: \"d030f4e9-69ab-40db-8fd6-d2f53d467cdb\") " pod="openshift-image-registry/image-registry-558df76499-sw5fx"
Apr 16 16:48:28.476478 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.476451 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/536018da-d22a-4d5a-a54d-4d2116c68151-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-8596fb7f8-nnmch\" (UID: \"536018da-d22a-4d5a-a54d-4d2116c68151\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8596fb7f8-nnmch"
Apr 16 16:48:28.476693 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.476478 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d5qr7\" (UniqueName: \"kubernetes.io/projected/36ad6397-4200-4581-8948-4e5ecae47a04-kube-api-access-d5qr7\") pod \"managed-serviceaccount-addon-agent-cdff685c-x7jtx\" (UID: \"36ad6397-4200-4581-8948-4e5ecae47a04\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-cdff685c-x7jtx"
Apr 16 16:48:28.476870 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.476838 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d030f4e9-69ab-40db-8fd6-d2f53d467cdb-ca-trust-extracted\") pod \"image-registry-558df76499-sw5fx\" (UID: \"d030f4e9-69ab-40db-8fd6-d2f53d467cdb\") " pod="openshift-image-registry/image-registry-558df76499-sw5fx"
Apr 16 16:48:28.476969 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.476894 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5ca9f9f4-bcc1-439f-8f7f-3c133fadff5f-tmp\") pod \"klusterlet-addon-workmgr-8547679d7d-25lps\" (UID: \"5ca9f9f4-bcc1-439f-8f7f-3c133fadff5f\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8547679d7d-25lps"
Apr 16 16:48:28.476969 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.476950 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/536018da-d22a-4d5a-a54d-4d2116c68151-hub\") pod \"cluster-proxy-proxy-agent-8596fb7f8-nnmch\" (UID: \"536018da-d22a-4d5a-a54d-4d2116c68151\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8596fb7f8-nnmch"
Apr 16 16:48:28.477092 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.476978 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/536018da-d22a-4d5a-a54d-4d2116c68151-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-8596fb7f8-nnmch\" (UID: \"536018da-d22a-4d5a-a54d-4d2116c68151\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8596fb7f8-nnmch"
Apr 16 16:48:28.477092 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.477003 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x5g82\" (UniqueName: \"kubernetes.io/projected/d030f4e9-69ab-40db-8fd6-d2f53d467cdb-kube-api-access-x5g82\") pod \"image-registry-558df76499-sw5fx\" (UID: \"d030f4e9-69ab-40db-8fd6-d2f53d467cdb\") " pod="openshift-image-registry/image-registry-558df76499-sw5fx"
Apr 16 16:48:28.477092 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.477033 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hgqz\" (UniqueName: \"kubernetes.io/projected/40e02308-a3e4-43c3-8e6d-b59cfe039143-kube-api-access-4hgqz\") pod \"ingress-canary-zt96l\" (UID: \"40e02308-a3e4-43c3-8e6d-b59cfe039143\") " pod="openshift-ingress-canary/ingress-canary-zt96l"
Apr 16 16:48:28.477092 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.477063 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mqmns\" (UniqueName: \"kubernetes.io/projected/536018da-d22a-4d5a-a54d-4d2116c68151-kube-api-access-mqmns\") pod \"cluster-proxy-proxy-agent-8596fb7f8-nnmch\" (UID: \"536018da-d22a-4d5a-a54d-4d2116c68151\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8596fb7f8-nnmch"
Apr 16 16:48:28.477248 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.477092 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fp7gx\" (UniqueName: \"kubernetes.io/projected/5ca9f9f4-bcc1-439f-8f7f-3c133fadff5f-kube-api-access-fp7gx\") pod \"klusterlet-addon-workmgr-8547679d7d-25lps\" (UID: \"5ca9f9f4-bcc1-439f-8f7f-3c133fadff5f\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8547679d7d-25lps"
Apr 16 16:48:28.477248 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.477120 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dbgg\" (UniqueName: \"kubernetes.io/projected/db86a360-38b7-4c87-ac77-176127220106-kube-api-access-4dbgg\") pod \"dns-default-lhgcr\" (UID: \"db86a360-38b7-4c87-ac77-176127220106\") " pod="openshift-dns/dns-default-lhgcr"
Apr 16 16:48:28.477248 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.477158 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/db86a360-38b7-4c87-ac77-176127220106-metrics-tls\") pod \"dns-default-lhgcr\" (UID: \"db86a360-38b7-4c87-ac77-176127220106\") " pod="openshift-dns/dns-default-lhgcr"
Apr 16 16:48:28.477248 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.477217 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/36ad6397-4200-4581-8948-4e5ecae47a04-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-cdff685c-x7jtx\" (UID: \"36ad6397-4200-4581-8948-4e5ecae47a04\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-cdff685c-x7jtx"
Apr 16 16:48:28.477248 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.477244 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d030f4e9-69ab-40db-8fd6-d2f53d467cdb-registry-tls\") pod \"image-registry-558df76499-sw5fx\" (UID: \"d030f4e9-69ab-40db-8fd6-d2f53d467cdb\") " pod="openshift-image-registry/image-registry-558df76499-sw5fx"
Apr 16 16:48:28.477484 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.477279 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d030f4e9-69ab-40db-8fd6-d2f53d467cdb-registry-certificates\") pod \"image-registry-558df76499-sw5fx\" (UID: \"d030f4e9-69ab-40db-8fd6-d2f53d467cdb\") " pod="openshift-image-registry/image-registry-558df76499-sw5fx"
Apr 16 16:48:28.477484 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.477285 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5ca9f9f4-bcc1-439f-8f7f-3c133fadff5f-tmp\") pod \"klusterlet-addon-workmgr-8547679d7d-25lps\" (UID: \"5ca9f9f4-bcc1-439f-8f7f-3c133fadff5f\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8547679d7d-25lps"
Apr 16 16:48:28.477484 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.477316 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/db86a360-38b7-4c87-ac77-176127220106-config-volume\") pod \"dns-default-lhgcr\" (UID: \"db86a360-38b7-4c87-ac77-176127220106\") " pod="openshift-dns/dns-default-lhgcr"
Apr 16 16:48:28.477484 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.477340 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/db86a360-38b7-4c87-ac77-176127220106-tmp-dir\") pod \"dns-default-lhgcr\" (UID: \"db86a360-38b7-4c87-ac77-176127220106\") " pod="openshift-dns/dns-default-lhgcr"
Apr 16 16:48:28.477484 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.477365 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/40e02308-a3e4-43c3-8e6d-b59cfe039143-cert\") pod \"ingress-canary-zt96l\" (UID: \"40e02308-a3e4-43c3-8e6d-b59cfe039143\") " pod="openshift-ingress-canary/ingress-canary-zt96l"
Apr 16 16:48:28.477484 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.477428 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d030f4e9-69ab-40db-8fd6-d2f53d467cdb-trusted-ca\") pod \"image-registry-558df76499-sw5fx\" (UID: \"d030f4e9-69ab-40db-8fd6-d2f53d467cdb\") " pod="openshift-image-registry/image-registry-558df76499-sw5fx"
Apr 16 16:48:28.477484 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.477456 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d030f4e9-69ab-40db-8fd6-d2f53d467cdb-installation-pull-secrets\") pod \"image-registry-558df76499-sw5fx\" (UID: \"d030f4e9-69ab-40db-8fd6-d2f53d467cdb\") " pod="openshift-image-registry/image-registry-558df76499-sw5fx"
Apr 16 16:48:28.477752 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.477484 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/536018da-d22a-4d5a-a54d-4d2116c68151-ca\") pod \"cluster-proxy-proxy-agent-8596fb7f8-nnmch\" (UID: \"536018da-d22a-4d5a-a54d-4d2116c68151\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8596fb7f8-nnmch"
Apr 16 16:48:28.477752 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.477513 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/5ca9f9f4-bcc1-439f-8f7f-3c133fadff5f-klusterlet-config\") pod \"klusterlet-addon-workmgr-8547679d7d-25lps\" (UID: \"5ca9f9f4-bcc1-439f-8f7f-3c133fadff5f\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8547679d7d-25lps"
Apr 16 16:48:28.478704 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:28.478487 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 16:48:28.478704 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:28.478505 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-558df76499-sw5fx: secret "image-registry-tls" not found
Apr 16 16:48:28.478704 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:28.478557 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d030f4e9-69ab-40db-8fd6-d2f53d467cdb-registry-tls podName:d030f4e9-69ab-40db-8fd6-d2f53d467cdb nodeName:}" failed. No retries permitted until 2026-04-16 16:48:28.97853803 +0000 UTC m=+34.119256751 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/d030f4e9-69ab-40db-8fd6-d2f53d467cdb-registry-tls") pod "image-registry-558df76499-sw5fx" (UID: "d030f4e9-69ab-40db-8fd6-d2f53d467cdb") : secret "image-registry-tls" not found
Apr 16 16:48:28.479326 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.479301 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d030f4e9-69ab-40db-8fd6-d2f53d467cdb-registry-certificates\") pod \"image-registry-558df76499-sw5fx\" (UID: \"d030f4e9-69ab-40db-8fd6-d2f53d467cdb\") " pod="openshift-image-registry/image-registry-558df76499-sw5fx"
Apr 16 16:48:28.479459 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.479413 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/536018da-d22a-4d5a-a54d-4d2116c68151-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-8596fb7f8-nnmch\" (UID: \"536018da-d22a-4d5a-a54d-4d2116c68151\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8596fb7f8-nnmch"
Apr 16 16:48:28.480444 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.480400 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d030f4e9-69ab-40db-8fd6-d2f53d467cdb-trusted-ca\") pod \"image-registry-558df76499-sw5fx\" (UID: \"d030f4e9-69ab-40db-8fd6-d2f53d467cdb\") " pod="openshift-image-registry/image-registry-558df76499-sw5fx"
Apr 16 16:48:28.482001 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.481553 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/536018da-d22a-4d5a-a54d-4d2116c68151-hub\") pod \"cluster-proxy-proxy-agent-8596fb7f8-nnmch\" (UID: \"536018da-d22a-4d5a-a54d-4d2116c68151\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8596fb7f8-nnmch"
Apr 16 16:48:28.482001 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.481562 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/536018da-d22a-4d5a-a54d-4d2116c68151-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-8596fb7f8-nnmch\" (UID: \"536018da-d22a-4d5a-a54d-4d2116c68151\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8596fb7f8-nnmch"
Apr 16 16:48:28.482001 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.481854 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/36ad6397-4200-4581-8948-4e5ecae47a04-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-cdff685c-x7jtx\" (UID: \"36ad6397-4200-4581-8948-4e5ecae47a04\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-cdff685c-x7jtx"
Apr 16 16:48:28.482001 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.481934 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/d030f4e9-69ab-40db-8fd6-d2f53d467cdb-image-registry-private-configuration\") pod \"image-registry-558df76499-sw5fx\" (UID: \"d030f4e9-69ab-40db-8fd6-d2f53d467cdb\") " pod="openshift-image-registry/image-registry-558df76499-sw5fx"
Apr 16 16:48:28.482001 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.481957 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/536018da-d22a-4d5a-a54d-4d2116c68151-ca\") pod \"cluster-proxy-proxy-agent-8596fb7f8-nnmch\" (UID: \"536018da-d22a-4d5a-a54d-4d2116c68151\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8596fb7f8-nnmch"
Apr 16 16:48:28.483247 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.483209 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/536018da-d22a-4d5a-a54d-4d2116c68151-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-8596fb7f8-nnmch\" (UID: \"536018da-d22a-4d5a-a54d-4d2116c68151\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8596fb7f8-nnmch"
Apr 16 16:48:28.483745 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.483705 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d030f4e9-69ab-40db-8fd6-d2f53d467cdb-installation-pull-secrets\") pod \"image-registry-558df76499-sw5fx\" (UID: \"d030f4e9-69ab-40db-8fd6-d2f53d467cdb\") " pod="openshift-image-registry/image-registry-558df76499-sw5fx"
Apr 16 16:48:28.484979 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.484957 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fp7gx\" (UniqueName: \"kubernetes.io/projected/5ca9f9f4-bcc1-439f-8f7f-3c133fadff5f-kube-api-access-fp7gx\") pod \"klusterlet-addon-workmgr-8547679d7d-25lps\" (UID: \"5ca9f9f4-bcc1-439f-8f7f-3c133fadff5f\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8547679d7d-25lps"
Apr 16 16:48:28.485804 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.485783 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d030f4e9-69ab-40db-8fd6-d2f53d467cdb-bound-sa-token\") pod \"image-registry-558df76499-sw5fx\" (UID: \"d030f4e9-69ab-40db-8fd6-d2f53d467cdb\") " pod="openshift-image-registry/image-registry-558df76499-sw5fx"
Apr 16 16:48:28.485890 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.485872 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5qr7\" (UniqueName: \"kubernetes.io/projected/36ad6397-4200-4581-8948-4e5ecae47a04-kube-api-access-d5qr7\") pod \"managed-serviceaccount-addon-agent-cdff685c-x7jtx\" (UID: \"36ad6397-4200-4581-8948-4e5ecae47a04\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-cdff685c-x7jtx"
Apr 16 16:48:28.486640 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.486617 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5g82\" (UniqueName: \"kubernetes.io/projected/d030f4e9-69ab-40db-8fd6-d2f53d467cdb-kube-api-access-x5g82\") pod \"image-registry-558df76499-sw5fx\" (UID: \"d030f4e9-69ab-40db-8fd6-d2f53d467cdb\") " pod="openshift-image-registry/image-registry-558df76499-sw5fx"
Apr 16 16:48:28.487565 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.487545 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqmns\" (UniqueName: \"kubernetes.io/projected/536018da-d22a-4d5a-a54d-4d2116c68151-kube-api-access-mqmns\") pod \"cluster-proxy-proxy-agent-8596fb7f8-nnmch\" (UID: \"536018da-d22a-4d5a-a54d-4d2116c68151\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8596fb7f8-nnmch"
Apr 16 16:48:28.492537 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.492517 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/5ca9f9f4-bcc1-439f-8f7f-3c133fadff5f-klusterlet-config\") pod \"klusterlet-addon-workmgr-8547679d7d-25lps\" (UID: \"5ca9f9f4-bcc1-439f-8f7f-3c133fadff5f\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8547679d7d-25lps"
Apr 16 16:48:28.559498 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.559469 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8596fb7f8-nnmch"
Apr 16 16:48:28.569398 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.569366 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8547679d7d-25lps"
Apr 16 16:48:28.578226 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.578202 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4hgqz\" (UniqueName: \"kubernetes.io/projected/40e02308-a3e4-43c3-8e6d-b59cfe039143-kube-api-access-4hgqz\") pod \"ingress-canary-zt96l\" (UID: \"40e02308-a3e4-43c3-8e6d-b59cfe039143\") " pod="openshift-ingress-canary/ingress-canary-zt96l"
Apr 16 16:48:28.578321 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.578240 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4dbgg\" (UniqueName: \"kubernetes.io/projected/db86a360-38b7-4c87-ac77-176127220106-kube-api-access-4dbgg\") pod \"dns-default-lhgcr\" (UID: \"db86a360-38b7-4c87-ac77-176127220106\") " pod="openshift-dns/dns-default-lhgcr"
Apr 16 16:48:28.578321 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.578269 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/db86a360-38b7-4c87-ac77-176127220106-metrics-tls\") pod \"dns-default-lhgcr\" (UID: \"db86a360-38b7-4c87-ac77-176127220106\") " pod="openshift-dns/dns-default-lhgcr"
Apr 16 16:48:28.578458 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:28.578429 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 16:48:28.578514 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:28.578497 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/db86a360-38b7-4c87-ac77-176127220106-metrics-tls podName:db86a360-38b7-4c87-ac77-176127220106 nodeName:}" failed. No retries permitted until 2026-04-16 16:48:29.078477402 +0000 UTC m=+34.219196142 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/db86a360-38b7-4c87-ac77-176127220106-metrics-tls") pod "dns-default-lhgcr" (UID: "db86a360-38b7-4c87-ac77-176127220106") : secret "dns-default-metrics-tls" not found
Apr 16 16:48:28.578572 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.578536 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/db86a360-38b7-4c87-ac77-176127220106-config-volume\") pod \"dns-default-lhgcr\" (UID: \"db86a360-38b7-4c87-ac77-176127220106\") " pod="openshift-dns/dns-default-lhgcr"
Apr 16 16:48:28.578572 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.578565 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/db86a360-38b7-4c87-ac77-176127220106-tmp-dir\") pod \"dns-default-lhgcr\" (UID: \"db86a360-38b7-4c87-ac77-176127220106\") " pod="openshift-dns/dns-default-lhgcr"
Apr 16 16:48:28.578669 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.578593 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/40e02308-a3e4-43c3-8e6d-b59cfe039143-cert\") pod \"ingress-canary-zt96l\" (UID: \"40e02308-a3e4-43c3-8e6d-b59cfe039143\") " pod="openshift-ingress-canary/ingress-canary-zt96l"
Apr 16 16:48:28.578782 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:28.578763 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 16:48:28.578846 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:28.578820 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/40e02308-a3e4-43c3-8e6d-b59cfe039143-cert podName:40e02308-a3e4-43c3-8e6d-b59cfe039143 nodeName:}" failed. No retries permitted until 2026-04-16 16:48:29.078805455 +0000 UTC m=+34.219524175 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/40e02308-a3e4-43c3-8e6d-b59cfe039143-cert") pod "ingress-canary-zt96l" (UID: "40e02308-a3e4-43c3-8e6d-b59cfe039143") : secret "canary-serving-cert" not found
Apr 16 16:48:28.578993 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.578970 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/db86a360-38b7-4c87-ac77-176127220106-tmp-dir\") pod \"dns-default-lhgcr\" (UID: \"db86a360-38b7-4c87-ac77-176127220106\") " pod="openshift-dns/dns-default-lhgcr"
Apr 16 16:48:28.579083 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.579034 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/db86a360-38b7-4c87-ac77-176127220106-config-volume\") pod \"dns-default-lhgcr\" (UID: \"db86a360-38b7-4c87-ac77-176127220106\") " pod="openshift-dns/dns-default-lhgcr"
Apr 16 16:48:28.587279 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.587260 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-cdff685c-x7jtx"
Apr 16 16:48:28.589587 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.589565 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dbgg\" (UniqueName: \"kubernetes.io/projected/db86a360-38b7-4c87-ac77-176127220106-kube-api-access-4dbgg\") pod \"dns-default-lhgcr\" (UID: \"db86a360-38b7-4c87-ac77-176127220106\") " pod="openshift-dns/dns-default-lhgcr"
Apr 16 16:48:28.589845 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.589825 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hgqz\" (UniqueName: \"kubernetes.io/projected/40e02308-a3e4-43c3-8e6d-b59cfe039143-kube-api-access-4hgqz\") pod \"ingress-canary-zt96l\" (UID: \"40e02308-a3e4-43c3-8e6d-b59cfe039143\") " pod="openshift-ingress-canary/ingress-canary-zt96l"
Apr 16 16:48:28.981117 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:28.981088 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d030f4e9-69ab-40db-8fd6-d2f53d467cdb-registry-tls\") pod \"image-registry-558df76499-sw5fx\" (UID: \"d030f4e9-69ab-40db-8fd6-d2f53d467cdb\") " pod="openshift-image-registry/image-registry-558df76499-sw5fx"
Apr 16 16:48:28.981284 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:28.981241 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 16:48:28.981284 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:28.981260 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-558df76499-sw5fx: secret "image-registry-tls" not found
Apr 16 16:48:28.981397 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:28.981320 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d030f4e9-69ab-40db-8fd6-d2f53d467cdb-registry-tls podName:d030f4e9-69ab-40db-8fd6-d2f53d467cdb nodeName:}" failed. No retries permitted until 2026-04-16 16:48:29.981296197 +0000 UTC m=+35.122014917 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/d030f4e9-69ab-40db-8fd6-d2f53d467cdb-registry-tls") pod "image-registry-558df76499-sw5fx" (UID: "d030f4e9-69ab-40db-8fd6-d2f53d467cdb") : secret "image-registry-tls" not found
Apr 16 16:48:29.082033 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:29.081999 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/db86a360-38b7-4c87-ac77-176127220106-metrics-tls\") pod \"dns-default-lhgcr\" (UID: \"db86a360-38b7-4c87-ac77-176127220106\") " pod="openshift-dns/dns-default-lhgcr"
Apr 16 16:48:29.082187 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:29.082042 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f7e26d85-638f-42c1-9b32-67320a5cbbe3-metrics-certs\") pod \"network-metrics-daemon-x6gbd\" (UID: \"f7e26d85-638f-42c1-9b32-67320a5cbbe3\") " pod="openshift-multus/network-metrics-daemon-x6gbd"
Apr 16 16:48:29.082187 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:29.082104 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/40e02308-a3e4-43c3-8e6d-b59cfe039143-cert\") pod \"ingress-canary-zt96l\" (UID: \"40e02308-a3e4-43c3-8e6d-b59cfe039143\") " pod="openshift-ingress-canary/ingress-canary-zt96l"
Apr 16 16:48:29.082187 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:29.082154 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 16:48:29.082337 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:29.082236 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/db86a360-38b7-4c87-ac77-176127220106-metrics-tls podName:db86a360-38b7-4c87-ac77-176127220106 nodeName:}" failed. No retries permitted until 2026-04-16 16:48:30.082214771 +0000 UTC m=+35.222933496 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/db86a360-38b7-4c87-ac77-176127220106-metrics-tls") pod "dns-default-lhgcr" (UID: "db86a360-38b7-4c87-ac77-176127220106") : secret "dns-default-metrics-tls" not found
Apr 16 16:48:29.082337 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:29.082235 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 16:48:29.082337 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:29.082275 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/40e02308-a3e4-43c3-8e6d-b59cfe039143-cert podName:40e02308-a3e4-43c3-8e6d-b59cfe039143 nodeName:}" failed. No retries permitted until 2026-04-16 16:48:30.082264373 +0000 UTC m=+35.222983093 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/40e02308-a3e4-43c3-8e6d-b59cfe039143-cert") pod "ingress-canary-zt96l" (UID: "40e02308-a3e4-43c3-8e6d-b59cfe039143") : secret "canary-serving-cert" not found
Apr 16 16:48:29.082337 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:29.082236 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 16 16:48:29.082337 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:29.082305 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f7e26d85-638f-42c1-9b32-67320a5cbbe3-metrics-certs podName:f7e26d85-638f-42c1-9b32-67320a5cbbe3 nodeName:}" failed. No retries permitted until 2026-04-16 16:49:01.082297747 +0000 UTC m=+66.223016470 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f7e26d85-638f-42c1-9b32-67320a5cbbe3-metrics-certs") pod "network-metrics-daemon-x6gbd" (UID: "f7e26d85-638f-42c1-9b32-67320a5cbbe3") : secret "metrics-daemon-secret" not found
Apr 16 16:48:29.284265 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:29.283961 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f7dsv\" (UniqueName: \"kubernetes.io/projected/41dd0be2-82c4-4469-b8d9-d1b98a4adb55-kube-api-access-f7dsv\") pod \"network-check-target-md7k7\" (UID: \"41dd0be2-82c4-4469-b8d9-d1b98a4adb55\") " pod="openshift-network-diagnostics/network-check-target-md7k7"
Apr 16 16:48:29.288668 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:29.287749 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7dsv\" (UniqueName: \"kubernetes.io/projected/41dd0be2-82c4-4469-b8d9-d1b98a4adb55-kube-api-access-f7dsv\") pod \"network-check-target-md7k7\" (UID: \"41dd0be2-82c4-4469-b8d9-d1b98a4adb55\") " pod="openshift-network-diagnostics/network-check-target-md7k7"
Apr 16 16:48:29.336938 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:29.336912 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-md7k7"
Apr 16 16:48:29.343005 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:29.342981 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-cdff685c-x7jtx"]
Apr 16 16:48:29.345504 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:29.345477 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8596fb7f8-nnmch"]
Apr 16 16:48:29.355116 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:29.355095 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-8547679d7d-25lps"]
Apr 16 16:48:29.495426 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:48:29.495403 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod36ad6397_4200_4581_8948_4e5ecae47a04.slice/crio-050d6e8d5c3db4e48e86418d40b36adaa8be127ab803eba93d0b55ea16ef1028 WatchSource:0}: Error finding container 050d6e8d5c3db4e48e86418d40b36adaa8be127ab803eba93d0b55ea16ef1028: Status 404 returned error can't find the container with id 050d6e8d5c3db4e48e86418d40b36adaa8be127ab803eba93d0b55ea16ef1028
Apr 16 16:48:29.496216 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:48:29.496094 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod536018da_d22a_4d5a_a54d_4d2116c68151.slice/crio-67c86aaadd4c68e718e2b82b3740ea7dc03b12819fbdae9bac6ed319aae27185 WatchSource:0}: Error finding container 67c86aaadd4c68e718e2b82b3740ea7dc03b12819fbdae9bac6ed319aae27185: Status 404 returned error can't find the container with id 67c86aaadd4c68e718e2b82b3740ea7dc03b12819fbdae9bac6ed319aae27185
Apr 16 16:48:29.496744 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:48:29.496723 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ca9f9f4_bcc1_439f_8f7f_3c133fadff5f.slice/crio-3bb273aeed5be34226bda6de9dc0012d3b004e012226e4b01daa638768fb2b80 WatchSource:0}: Error finding container 3bb273aeed5be34226bda6de9dc0012d3b004e012226e4b01daa638768fb2b80: Status 404 returned error can't find the container with id 3bb273aeed5be34226bda6de9dc0012d3b004e012226e4b01daa638768fb2b80
Apr 16 16:48:29.621823 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:29.621776 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-cdff685c-x7jtx" event={"ID":"36ad6397-4200-4581-8948-4e5ecae47a04","Type":"ContainerStarted","Data":"050d6e8d5c3db4e48e86418d40b36adaa8be127ab803eba93d0b55ea16ef1028"}
Apr 16 16:48:29.623071 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:29.623004 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8547679d7d-25lps" event={"ID":"5ca9f9f4-bcc1-439f-8f7f-3c133fadff5f","Type":"ContainerStarted","Data":"3bb273aeed5be34226bda6de9dc0012d3b004e012226e4b01daa638768fb2b80"}
Apr 16 16:48:29.624258 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:29.624227 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8596fb7f8-nnmch" event={"ID":"536018da-d22a-4d5a-a54d-4d2116c68151","Type":"ContainerStarted","Data":"67c86aaadd4c68e718e2b82b3740ea7dc03b12819fbdae9bac6ed319aae27185"}
Apr 16 16:48:29.660044 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:29.660020 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-md7k7"]
Apr 16 16:48:29.668354 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:48:29.668333 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod41dd0be2_82c4_4469_b8d9_d1b98a4adb55.slice/crio-82d1730c9a688193fe9a9cebee2f56e1c0f744b13760c7ab2d21e6ff1a0bb5e6 WatchSource:0}: Error finding container 82d1730c9a688193fe9a9cebee2f56e1c0f744b13760c7ab2d21e6ff1a0bb5e6: Status 404 returned error can't find the container with id 82d1730c9a688193fe9a9cebee2f56e1c0f744b13760c7ab2d21e6ff1a0bb5e6
Apr 16 16:48:29.991771 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:29.991744 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d030f4e9-69ab-40db-8fd6-d2f53d467cdb-registry-tls\") pod \"image-registry-558df76499-sw5fx\" (UID: \"d030f4e9-69ab-40db-8fd6-d2f53d467cdb\") " pod="openshift-image-registry/image-registry-558df76499-sw5fx"
Apr 16 16:48:29.991954 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:29.991887 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 16:48:29.991954 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:29.991905 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-558df76499-sw5fx: secret "image-registry-tls" not found
Apr 16 16:48:29.992048 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:29.991959 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d030f4e9-69ab-40db-8fd6-d2f53d467cdb-registry-tls podName:d030f4e9-69ab-40db-8fd6-d2f53d467cdb nodeName:}" failed. No retries permitted until 2026-04-16 16:48:31.991944127 +0000 UTC m=+37.132662851 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/d030f4e9-69ab-40db-8fd6-d2f53d467cdb-registry-tls") pod "image-registry-558df76499-sw5fx" (UID: "d030f4e9-69ab-40db-8fd6-d2f53d467cdb") : secret "image-registry-tls" not found
Apr 16 16:48:30.092099 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:30.092033 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/db86a360-38b7-4c87-ac77-176127220106-metrics-tls\") pod \"dns-default-lhgcr\" (UID: \"db86a360-38b7-4c87-ac77-176127220106\") " pod="openshift-dns/dns-default-lhgcr"
Apr 16 16:48:30.092204 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:30.092107 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/40e02308-a3e4-43c3-8e6d-b59cfe039143-cert\") pod \"ingress-canary-zt96l\" (UID: \"40e02308-a3e4-43c3-8e6d-b59cfe039143\") " pod="openshift-ingress-canary/ingress-canary-zt96l"
Apr 16 16:48:30.092204 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:30.092186 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 16:48:30.092277 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:30.092257 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 16:48:30.092307 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:30.092274 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/db86a360-38b7-4c87-ac77-176127220106-metrics-tls podName:db86a360-38b7-4c87-ac77-176127220106 nodeName:}" failed. No retries permitted until 2026-04-16 16:48:32.09225955 +0000 UTC m=+37.232978269 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/db86a360-38b7-4c87-ac77-176127220106-metrics-tls") pod "dns-default-lhgcr" (UID: "db86a360-38b7-4c87-ac77-176127220106") : secret "dns-default-metrics-tls" not found Apr 16 16:48:30.092359 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:30.092315 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/40e02308-a3e4-43c3-8e6d-b59cfe039143-cert podName:40e02308-a3e4-43c3-8e6d-b59cfe039143 nodeName:}" failed. No retries permitted until 2026-04-16 16:48:32.092297602 +0000 UTC m=+37.233016330 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/40e02308-a3e4-43c3-8e6d-b59cfe039143-cert") pod "ingress-canary-zt96l" (UID: "40e02308-a3e4-43c3-8e6d-b59cfe039143") : secret "canary-serving-cert" not found Apr 16 16:48:30.631875 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:30.631805 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-md7k7" event={"ID":"41dd0be2-82c4-4469-b8d9-d1b98a4adb55","Type":"ContainerStarted","Data":"82d1730c9a688193fe9a9cebee2f56e1c0f744b13760c7ab2d21e6ff1a0bb5e6"} Apr 16 16:48:30.638091 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:30.637214 2573 generic.go:358] "Generic (PLEG): container finished" podID="50367a6c-7164-45c2-b2f1-af3375aa5768" containerID="ac2f56ec0b7c4f7011daa5a222f254c941ab852c2a6389d81049e0b1cc4a9395" exitCode=0 Apr 16 16:48:30.638091 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:30.637266 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-t7jrv" event={"ID":"50367a6c-7164-45c2-b2f1-af3375aa5768","Type":"ContainerDied","Data":"ac2f56ec0b7c4f7011daa5a222f254c941ab852c2a6389d81049e0b1cc4a9395"} Apr 16 16:48:31.654197 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:31.653086 2573 generic.go:358] "Generic (PLEG): container finished" 
podID="50367a6c-7164-45c2-b2f1-af3375aa5768" containerID="6e74b32eef40c5eeca9d58ea57ff7a90baf5ccff9e50f855a90ced2a3d1b8bfe" exitCode=0 Apr 16 16:48:31.654197 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:31.653351 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-t7jrv" event={"ID":"50367a6c-7164-45c2-b2f1-af3375aa5768","Type":"ContainerDied","Data":"6e74b32eef40c5eeca9d58ea57ff7a90baf5ccff9e50f855a90ced2a3d1b8bfe"} Apr 16 16:48:32.008832 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:32.008799 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d030f4e9-69ab-40db-8fd6-d2f53d467cdb-registry-tls\") pod \"image-registry-558df76499-sw5fx\" (UID: \"d030f4e9-69ab-40db-8fd6-d2f53d467cdb\") " pod="openshift-image-registry/image-registry-558df76499-sw5fx" Apr 16 16:48:32.008985 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:32.008944 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 16:48:32.008985 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:32.008959 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-558df76499-sw5fx: secret "image-registry-tls" not found Apr 16 16:48:32.009090 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:32.009014 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d030f4e9-69ab-40db-8fd6-d2f53d467cdb-registry-tls podName:d030f4e9-69ab-40db-8fd6-d2f53d467cdb nodeName:}" failed. No retries permitted until 2026-04-16 16:48:36.008994778 +0000 UTC m=+41.149713520 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/d030f4e9-69ab-40db-8fd6-d2f53d467cdb-registry-tls") pod "image-registry-558df76499-sw5fx" (UID: "d030f4e9-69ab-40db-8fd6-d2f53d467cdb") : secret "image-registry-tls" not found Apr 16 16:48:32.110243 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:32.109492 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/db86a360-38b7-4c87-ac77-176127220106-metrics-tls\") pod \"dns-default-lhgcr\" (UID: \"db86a360-38b7-4c87-ac77-176127220106\") " pod="openshift-dns/dns-default-lhgcr" Apr 16 16:48:32.110243 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:32.109581 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/40e02308-a3e4-43c3-8e6d-b59cfe039143-cert\") pod \"ingress-canary-zt96l\" (UID: \"40e02308-a3e4-43c3-8e6d-b59cfe039143\") " pod="openshift-ingress-canary/ingress-canary-zt96l" Apr 16 16:48:32.110243 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:32.109708 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 16:48:32.110243 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:32.109768 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/40e02308-a3e4-43c3-8e6d-b59cfe039143-cert podName:40e02308-a3e4-43c3-8e6d-b59cfe039143 nodeName:}" failed. No retries permitted until 2026-04-16 16:48:36.109749504 +0000 UTC m=+41.250468238 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/40e02308-a3e4-43c3-8e6d-b59cfe039143-cert") pod "ingress-canary-zt96l" (UID: "40e02308-a3e4-43c3-8e6d-b59cfe039143") : secret "canary-serving-cert" not found Apr 16 16:48:32.110243 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:32.110159 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 16:48:32.110243 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:32.110209 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/db86a360-38b7-4c87-ac77-176127220106-metrics-tls podName:db86a360-38b7-4c87-ac77-176127220106 nodeName:}" failed. No retries permitted until 2026-04-16 16:48:36.110193853 +0000 UTC m=+41.250912579 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/db86a360-38b7-4c87-ac77-176127220106-metrics-tls") pod "dns-default-lhgcr" (UID: "db86a360-38b7-4c87-ac77-176127220106") : secret "dns-default-metrics-tls" not found Apr 16 16:48:32.659522 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:32.659487 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-t7jrv" event={"ID":"50367a6c-7164-45c2-b2f1-af3375aa5768","Type":"ContainerStarted","Data":"f3073ae3fb98e68e26e3fc3eb2e4baba4048d32d31e95c31138dd571611743b8"} Apr 16 16:48:32.683773 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:32.683464 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-t7jrv" podStartSLOduration=6.118599587 podStartE2EDuration="37.683448182s" podCreationTimestamp="2026-04-16 16:47:55 +0000 UTC" firstStartedPulling="2026-04-16 16:47:57.976125345 +0000 UTC m=+3.116844069" lastFinishedPulling="2026-04-16 16:48:29.540973944 +0000 UTC m=+34.681692664" observedRunningTime="2026-04-16 16:48:32.681576274 +0000 UTC 
m=+37.822295017" watchObservedRunningTime="2026-04-16 16:48:32.683448182 +0000 UTC m=+37.824166926" Apr 16 16:48:33.235289 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.235257 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-667775844f-xr578"] Apr 16 16:48:33.246527 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.246507 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-2gpsx"] Apr 16 16:48:33.246726 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.246705 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-xr578" Apr 16 16:48:33.249737 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.249709 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-wk772\"" Apr 16 16:48:33.249846 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.249714 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 16 16:48:33.254560 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.254364 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 16 16:48:33.255735 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.255678 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 16 16:48:33.261027 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.260984 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5887855b54-djpw5"] Apr 16 16:48:33.261149 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.261121 2573 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-2gpsx" Apr 16 16:48:33.264934 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.264914 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-4wl98\"" Apr 16 16:48:33.265094 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.264947 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 16 16:48:33.265186 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.264965 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 16 16:48:33.277195 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.277178 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-d87b8d5fc-jgsfb"] Apr 16 16:48:33.277320 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.277306 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5887855b54-djpw5" Apr 16 16:48:33.281035 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.281017 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 16 16:48:33.281190 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.281112 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 16 16:48:33.281255 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.281209 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 16 16:48:33.281480 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.281465 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 16 16:48:33.281683 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.281667 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 16 16:48:33.281830 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.281811 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 16 16:48:33.281997 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.281947 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-q249j\"" Apr 16 16:48:33.291023 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.290982 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-2gpsx"] Apr 16 16:48:33.291023 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.291006 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-667775844f-xr578"] Apr 16 
16:48:33.291023 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.291020 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-d87b8d5fc-jgsfb"] Apr 16 16:48:33.291350 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.291032 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-5887855b54-djpw5"] Apr 16 16:48:33.291350 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.291179 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-d87b8d5fc-jgsfb" Apr 16 16:48:33.293827 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.293810 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 16 16:48:33.294125 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.294103 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 16 16:48:33.294429 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.294409 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 16 16:48:33.294574 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.294539 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 16 16:48:33.294810 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.294793 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-sjndb\"" Apr 16 16:48:33.301272 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.301254 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 16 16:48:33.335325 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.335300 
2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-2sdvm"] Apr 16 16:48:33.347941 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.347922 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-5cb6cf4cb4-9v9hx"] Apr 16 16:48:33.348108 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.348087 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-2sdvm" Apr 16 16:48:33.351040 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.351018 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-7w64z"] Apr 16 16:48:33.351224 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.351203 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-kjbzf\"" Apr 16 16:48:33.351577 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.351557 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-9v9hx" Apr 16 16:48:33.354362 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.354344 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6667474d89-xn576"] Apr 16 16:48:33.354847 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.354819 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-hxttb\"" Apr 16 16:48:33.355086 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.355032 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 16 16:48:33.356094 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.356069 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 16 16:48:33.358684 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.357246 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69965bb79d-7wg74"] Apr 16 16:48:33.358684 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.357735 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-7w64z" Apr 16 16:48:33.358684 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.358051 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-xn576" Apr 16 16:48:33.360936 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.360917 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-2sdvm"] Apr 16 16:48:33.361045 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.360942 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-5cb6cf4cb4-9v9hx"] Apr 16 16:48:33.361045 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.360959 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69965bb79d-7wg74"] Apr 16 16:48:33.361153 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.361099 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-7wg74" Apr 16 16:48:33.361751 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.361732 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 16 16:48:33.362138 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.362113 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 16 16:48:33.362579 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.362528 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-h4ng5\"" Apr 16 16:48:33.362669 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.362618 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 16 16:48:33.363478 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.363456 2573 
reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 16 16:48:33.364440 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.364420 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 16 16:48:33.364590 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.364570 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-rjlj5\"" Apr 16 16:48:33.364845 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.364827 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 16 16:48:33.367510 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.365409 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 16 16:48:33.367510 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.366201 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 16 16:48:33.367510 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.366297 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 16 16:48:33.367510 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.367106 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 16 16:48:33.368255 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.368232 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-7w64z"] Apr 16 16:48:33.369930 ip-10-0-138-58 
kubenswrapper[2573]: I0416 16:48:33.369910 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 16 16:48:33.370288 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.370097 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-9przc\"" Apr 16 16:48:33.370288 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.370126 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 16 16:48:33.372460 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.372441 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6667474d89-xn576"] Apr 16 16:48:33.420894 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.420866 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/682aafdb-c596-4a7b-8112-c6c867ff770e-stats-auth\") pod \"router-default-5887855b54-djpw5\" (UID: \"682aafdb-c596-4a7b-8112-c6c867ff770e\") " pod="openshift-ingress/router-default-5887855b54-djpw5" Apr 16 16:48:33.421363 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.421341 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6c99cce0-b27a-481f-8825-9d205581b7d0-trusted-ca\") pod \"console-operator-d87b8d5fc-jgsfb\" (UID: \"6c99cce0-b27a-481f-8825-9d205581b7d0\") " pod="openshift-console-operator/console-operator-d87b8d5fc-jgsfb" Apr 16 16:48:33.421475 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.421396 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/682aafdb-c596-4a7b-8112-c6c867ff770e-service-ca-bundle\") pod \"router-default-5887855b54-djpw5\" (UID: \"682aafdb-c596-4a7b-8112-c6c867ff770e\") " pod="openshift-ingress/router-default-5887855b54-djpw5" Apr 16 16:48:33.421475 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.421426 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgf7j\" (UniqueName: \"kubernetes.io/projected/682aafdb-c596-4a7b-8112-c6c867ff770e-kube-api-access-mgf7j\") pod \"router-default-5887855b54-djpw5\" (UID: \"682aafdb-c596-4a7b-8112-c6c867ff770e\") " pod="openshift-ingress/router-default-5887855b54-djpw5" Apr 16 16:48:33.421573 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.421543 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnfcn\" (UniqueName: \"kubernetes.io/projected/96edca93-47fe-432e-86c9-b734c62b1424-kube-api-access-pnfcn\") pod \"volume-data-source-validator-7d955d5dd4-2gpsx\" (UID: \"96edca93-47fe-432e-86c9-b734c62b1424\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-2gpsx" Apr 16 16:48:33.421625 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.421569 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/682aafdb-c596-4a7b-8112-c6c867ff770e-default-certificate\") pod \"router-default-5887855b54-djpw5\" (UID: \"682aafdb-c596-4a7b-8112-c6c867ff770e\") " pod="openshift-ingress/router-default-5887855b54-djpw5" Apr 16 16:48:33.421625 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.421599 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6c99cce0-b27a-481f-8825-9d205581b7d0-serving-cert\") pod \"console-operator-d87b8d5fc-jgsfb\" (UID: 
\"6c99cce0-b27a-481f-8825-9d205581b7d0\") " pod="openshift-console-operator/console-operator-d87b8d5fc-jgsfb" Apr 16 16:48:33.421729 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.421666 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xc76z\" (UniqueName: \"kubernetes.io/projected/6c99cce0-b27a-481f-8825-9d205581b7d0-kube-api-access-xc76z\") pod \"console-operator-d87b8d5fc-jgsfb\" (UID: \"6c99cce0-b27a-481f-8825-9d205581b7d0\") " pod="openshift-console-operator/console-operator-d87b8d5fc-jgsfb" Apr 16 16:48:33.421729 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.421706 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9njwk\" (UniqueName: \"kubernetes.io/projected/8af394bc-025e-4545-801f-0e6309febaa3-kube-api-access-9njwk\") pod \"cluster-samples-operator-667775844f-xr578\" (UID: \"8af394bc-025e-4545-801f-0e6309febaa3\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-xr578" Apr 16 16:48:33.421833 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.421764 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/8af394bc-025e-4545-801f-0e6309febaa3-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-xr578\" (UID: \"8af394bc-025e-4545-801f-0e6309febaa3\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-xr578" Apr 16 16:48:33.421833 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.421806 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/682aafdb-c596-4a7b-8112-c6c867ff770e-metrics-certs\") pod \"router-default-5887855b54-djpw5\" (UID: \"682aafdb-c596-4a7b-8112-c6c867ff770e\") " pod="openshift-ingress/router-default-5887855b54-djpw5" Apr 16 
16:48:33.421932 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.421842 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c99cce0-b27a-481f-8825-9d205581b7d0-config\") pod \"console-operator-d87b8d5fc-jgsfb\" (UID: \"6c99cce0-b27a-481f-8825-9d205581b7d0\") " pod="openshift-console-operator/console-operator-d87b8d5fc-jgsfb" Apr 16 16:48:33.436808 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.436784 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-5785d4fcdd-x9csm"] Apr 16 16:48:33.451307 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.451285 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-5785d4fcdd-x9csm" Apr 16 16:48:33.455122 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.455102 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-5785d4fcdd-x9csm"] Apr 16 16:48:33.455985 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.455940 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 16 16:48:33.455985 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.455977 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 16 16:48:33.456135 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.456068 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-78ljj\"" Apr 16 16:48:33.456532 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.456322 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 16 16:48:33.456622 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.456556 2573 reflector.go:430] 
"Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 16 16:48:33.461179 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.461156 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 16 16:48:33.522478 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.522417 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/682aafdb-c596-4a7b-8112-c6c867ff770e-stats-auth\") pod \"router-default-5887855b54-djpw5\" (UID: \"682aafdb-c596-4a7b-8112-c6c867ff770e\") " pod="openshift-ingress/router-default-5887855b54-djpw5" Apr 16 16:48:33.522478 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.522453 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6c99cce0-b27a-481f-8825-9d205581b7d0-trusted-ca\") pod \"console-operator-d87b8d5fc-jgsfb\" (UID: \"6c99cce0-b27a-481f-8825-9d205581b7d0\") " pod="openshift-console-operator/console-operator-d87b8d5fc-jgsfb" Apr 16 16:48:33.522639 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.522479 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/682aafdb-c596-4a7b-8112-c6c867ff770e-service-ca-bundle\") pod \"router-default-5887855b54-djpw5\" (UID: \"682aafdb-c596-4a7b-8112-c6c867ff770e\") " pod="openshift-ingress/router-default-5887855b54-djpw5" Apr 16 16:48:33.522639 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.522504 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mgf7j\" (UniqueName: \"kubernetes.io/projected/682aafdb-c596-4a7b-8112-c6c867ff770e-kube-api-access-mgf7j\") pod \"router-default-5887855b54-djpw5\" (UID: \"682aafdb-c596-4a7b-8112-c6c867ff770e\") " 
pod="openshift-ingress/router-default-5887855b54-djpw5" Apr 16 16:48:33.522639 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.522573 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/363c07d0-bf5c-4368-a3fe-6d5136c2cd22-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-9v9hx\" (UID: \"363c07d0-bf5c-4368-a3fe-6d5136c2cd22\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-9v9hx" Apr 16 16:48:33.522639 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:33.522629 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/682aafdb-c596-4a7b-8112-c6c867ff770e-service-ca-bundle podName:682aafdb-c596-4a7b-8112-c6c867ff770e nodeName:}" failed. No retries permitted until 2026-04-16 16:48:34.022607066 +0000 UTC m=+39.163325787 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/682aafdb-c596-4a7b-8112-c6c867ff770e-service-ca-bundle") pod "router-default-5887855b54-djpw5" (UID: "682aafdb-c596-4a7b-8112-c6c867ff770e") : configmap references non-existent config key: service-ca.crt Apr 16 16:48:33.522843 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.522662 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pnfcn\" (UniqueName: \"kubernetes.io/projected/96edca93-47fe-432e-86c9-b734c62b1424-kube-api-access-pnfcn\") pod \"volume-data-source-validator-7d955d5dd4-2gpsx\" (UID: \"96edca93-47fe-432e-86c9-b734c62b1424\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-2gpsx" Apr 16 16:48:33.522843 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.522698 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/4b32d91f-2c9d-4d71-b910-066e212015e3-serving-cert\") pod \"kube-storage-version-migrator-operator-756bb7d76f-7w64z\" (UID: \"4b32d91f-2c9d-4d71-b910-066e212015e3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-7w64z" Apr 16 16:48:33.522843 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.522732 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/682aafdb-c596-4a7b-8112-c6c867ff770e-default-certificate\") pod \"router-default-5887855b54-djpw5\" (UID: \"682aafdb-c596-4a7b-8112-c6c867ff770e\") " pod="openshift-ingress/router-default-5887855b54-djpw5" Apr 16 16:48:33.522843 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.522761 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zglx\" (UniqueName: \"kubernetes.io/projected/2b75e55f-5bdd-4cbb-abd0-69be2a62852e-kube-api-access-8zglx\") pod \"service-ca-operator-69965bb79d-7wg74\" (UID: \"2b75e55f-5bdd-4cbb-abd0-69be2a62852e\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-7wg74" Apr 16 16:48:33.522843 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.522790 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6c99cce0-b27a-481f-8825-9d205581b7d0-serving-cert\") pod \"console-operator-d87b8d5fc-jgsfb\" (UID: \"6c99cce0-b27a-481f-8825-9d205581b7d0\") " pod="openshift-console-operator/console-operator-d87b8d5fc-jgsfb" Apr 16 16:48:33.522843 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.522832 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xc76z\" (UniqueName: \"kubernetes.io/projected/6c99cce0-b27a-481f-8825-9d205581b7d0-kube-api-access-xc76z\") pod \"console-operator-d87b8d5fc-jgsfb\" (UID: 
\"6c99cce0-b27a-481f-8825-9d205581b7d0\") " pod="openshift-console-operator/console-operator-d87b8d5fc-jgsfb" Apr 16 16:48:33.523194 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.522858 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/7bc7dff0-16af-4031-a829-4427a2699284-telemetry-config\") pod \"cluster-monitoring-operator-6667474d89-xn576\" (UID: \"7bc7dff0-16af-4031-a829-4427a2699284\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-xn576" Apr 16 16:48:33.523194 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.522893 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9njwk\" (UniqueName: \"kubernetes.io/projected/8af394bc-025e-4545-801f-0e6309febaa3-kube-api-access-9njwk\") pod \"cluster-samples-operator-667775844f-xr578\" (UID: \"8af394bc-025e-4545-801f-0e6309febaa3\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-xr578" Apr 16 16:48:33.523194 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.522916 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2b75e55f-5bdd-4cbb-abd0-69be2a62852e-serving-cert\") pod \"service-ca-operator-69965bb79d-7wg74\" (UID: \"2b75e55f-5bdd-4cbb-abd0-69be2a62852e\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-7wg74" Apr 16 16:48:33.523194 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.522945 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/7bc7dff0-16af-4031-a829-4427a2699284-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-xn576\" (UID: \"7bc7dff0-16af-4031-a829-4427a2699284\") " 
pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-xn576" Apr 16 16:48:33.523194 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.522978 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tszcd\" (UniqueName: \"kubernetes.io/projected/7bc7dff0-16af-4031-a829-4427a2699284-kube-api-access-tszcd\") pod \"cluster-monitoring-operator-6667474d89-xn576\" (UID: \"7bc7dff0-16af-4031-a829-4427a2699284\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-xn576" Apr 16 16:48:33.523194 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.523009 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fn84w\" (UniqueName: \"kubernetes.io/projected/533755f5-7620-43ef-aa9a-be97a74e8866-kube-api-access-fn84w\") pod \"network-check-source-7b678d77c7-2sdvm\" (UID: \"533755f5-7620-43ef-aa9a-be97a74e8866\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-2sdvm" Apr 16 16:48:33.523194 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.523037 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b75e55f-5bdd-4cbb-abd0-69be2a62852e-config\") pod \"service-ca-operator-69965bb79d-7wg74\" (UID: \"2b75e55f-5bdd-4cbb-abd0-69be2a62852e\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-7wg74" Apr 16 16:48:33.523194 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.523083 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/8af394bc-025e-4545-801f-0e6309febaa3-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-xr578\" (UID: \"8af394bc-025e-4545-801f-0e6309febaa3\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-xr578" Apr 16 16:48:33.523194 ip-10-0-138-58 
kubenswrapper[2573]: I0416 16:48:33.523111 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/363c07d0-bf5c-4368-a3fe-6d5136c2cd22-nginx-conf\") pod \"networking-console-plugin-5cb6cf4cb4-9v9hx\" (UID: \"363c07d0-bf5c-4368-a3fe-6d5136c2cd22\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-9v9hx" Apr 16 16:48:33.523194 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.523150 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/682aafdb-c596-4a7b-8112-c6c867ff770e-metrics-certs\") pod \"router-default-5887855b54-djpw5\" (UID: \"682aafdb-c596-4a7b-8112-c6c867ff770e\") " pod="openshift-ingress/router-default-5887855b54-djpw5" Apr 16 16:48:33.523194 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.523178 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c99cce0-b27a-481f-8825-9d205581b7d0-config\") pod \"console-operator-d87b8d5fc-jgsfb\" (UID: \"6c99cce0-b27a-481f-8825-9d205581b7d0\") " pod="openshift-console-operator/console-operator-d87b8d5fc-jgsfb" Apr 16 16:48:33.523694 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.523207 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b32d91f-2c9d-4d71-b910-066e212015e3-config\") pod \"kube-storage-version-migrator-operator-756bb7d76f-7w64z\" (UID: \"4b32d91f-2c9d-4d71-b910-066e212015e3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-7w64z" Apr 16 16:48:33.523694 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.523299 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnxr9\" (UniqueName: 
\"kubernetes.io/projected/4b32d91f-2c9d-4d71-b910-066e212015e3-kube-api-access-pnxr9\") pod \"kube-storage-version-migrator-operator-756bb7d76f-7w64z\" (UID: \"4b32d91f-2c9d-4d71-b910-066e212015e3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-7w64z" Apr 16 16:48:33.523694 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:33.523372 2573 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 16:48:33.523694 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.523440 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6c99cce0-b27a-481f-8825-9d205581b7d0-trusted-ca\") pod \"console-operator-d87b8d5fc-jgsfb\" (UID: \"6c99cce0-b27a-481f-8825-9d205581b7d0\") " pod="openshift-console-operator/console-operator-d87b8d5fc-jgsfb" Apr 16 16:48:33.523694 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:33.523457 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/682aafdb-c596-4a7b-8112-c6c867ff770e-metrics-certs podName:682aafdb-c596-4a7b-8112-c6c867ff770e nodeName:}" failed. No retries permitted until 2026-04-16 16:48:34.023439536 +0000 UTC m=+39.164158277 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/682aafdb-c596-4a7b-8112-c6c867ff770e-metrics-certs") pod "router-default-5887855b54-djpw5" (UID: "682aafdb-c596-4a7b-8112-c6c867ff770e") : secret "router-metrics-certs-default" not found Apr 16 16:48:33.523694 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:33.523399 2573 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 16:48:33.523694 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:33.523515 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8af394bc-025e-4545-801f-0e6309febaa3-samples-operator-tls podName:8af394bc-025e-4545-801f-0e6309febaa3 nodeName:}" failed. No retries permitted until 2026-04-16 16:48:34.023498845 +0000 UTC m=+39.164217578 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/8af394bc-025e-4545-801f-0e6309febaa3-samples-operator-tls") pod "cluster-samples-operator-667775844f-xr578" (UID: "8af394bc-025e-4545-801f-0e6309febaa3") : secret "samples-operator-tls" not found Apr 16 16:48:33.523995 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.523958 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c99cce0-b27a-481f-8825-9d205581b7d0-config\") pod \"console-operator-d87b8d5fc-jgsfb\" (UID: \"6c99cce0-b27a-481f-8825-9d205581b7d0\") " pod="openshift-console-operator/console-operator-d87b8d5fc-jgsfb" Apr 16 16:48:33.526683 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.526659 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6c99cce0-b27a-481f-8825-9d205581b7d0-serving-cert\") pod \"console-operator-d87b8d5fc-jgsfb\" (UID: \"6c99cce0-b27a-481f-8825-9d205581b7d0\") " 
pod="openshift-console-operator/console-operator-d87b8d5fc-jgsfb" Apr 16 16:48:33.526818 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.526802 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/682aafdb-c596-4a7b-8112-c6c867ff770e-default-certificate\") pod \"router-default-5887855b54-djpw5\" (UID: \"682aafdb-c596-4a7b-8112-c6c867ff770e\") " pod="openshift-ingress/router-default-5887855b54-djpw5" Apr 16 16:48:33.526883 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.526836 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/682aafdb-c596-4a7b-8112-c6c867ff770e-stats-auth\") pod \"router-default-5887855b54-djpw5\" (UID: \"682aafdb-c596-4a7b-8112-c6c867ff770e\") " pod="openshift-ingress/router-default-5887855b54-djpw5" Apr 16 16:48:33.533031 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.532446 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9njwk\" (UniqueName: \"kubernetes.io/projected/8af394bc-025e-4545-801f-0e6309febaa3-kube-api-access-9njwk\") pod \"cluster-samples-operator-667775844f-xr578\" (UID: \"8af394bc-025e-4545-801f-0e6309febaa3\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-xr578" Apr 16 16:48:33.533031 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.532987 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgf7j\" (UniqueName: \"kubernetes.io/projected/682aafdb-c596-4a7b-8112-c6c867ff770e-kube-api-access-mgf7j\") pod \"router-default-5887855b54-djpw5\" (UID: \"682aafdb-c596-4a7b-8112-c6c867ff770e\") " pod="openshift-ingress/router-default-5887855b54-djpw5" Apr 16 16:48:33.534970 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.534685 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnfcn\" (UniqueName: 
\"kubernetes.io/projected/96edca93-47fe-432e-86c9-b734c62b1424-kube-api-access-pnfcn\") pod \"volume-data-source-validator-7d955d5dd4-2gpsx\" (UID: \"96edca93-47fe-432e-86c9-b734c62b1424\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-2gpsx" Apr 16 16:48:33.541534 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.541516 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xc76z\" (UniqueName: \"kubernetes.io/projected/6c99cce0-b27a-481f-8825-9d205581b7d0-kube-api-access-xc76z\") pod \"console-operator-d87b8d5fc-jgsfb\" (UID: \"6c99cce0-b27a-481f-8825-9d205581b7d0\") " pod="openshift-console-operator/console-operator-d87b8d5fc-jgsfb" Apr 16 16:48:33.572736 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.572709 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-2gpsx" Apr 16 16:48:33.600512 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.600492 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-d87b8d5fc-jgsfb" Apr 16 16:48:33.624735 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.624710 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/363c07d0-bf5c-4368-a3fe-6d5136c2cd22-nginx-conf\") pod \"networking-console-plugin-5cb6cf4cb4-9v9hx\" (UID: \"363c07d0-bf5c-4368-a3fe-6d5136c2cd22\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-9v9hx" Apr 16 16:48:33.624822 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.624753 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b32d91f-2c9d-4d71-b910-066e212015e3-config\") pod \"kube-storage-version-migrator-operator-756bb7d76f-7w64z\" (UID: \"4b32d91f-2c9d-4d71-b910-066e212015e3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-7w64z" Apr 16 16:48:33.624822 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.624786 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pnxr9\" (UniqueName: \"kubernetes.io/projected/4b32d91f-2c9d-4d71-b910-066e212015e3-kube-api-access-pnxr9\") pod \"kube-storage-version-migrator-operator-756bb7d76f-7w64z\" (UID: \"4b32d91f-2c9d-4d71-b910-066e212015e3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-7w64z" Apr 16 16:48:33.624932 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.624862 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/363c07d0-bf5c-4368-a3fe-6d5136c2cd22-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-9v9hx\" (UID: \"363c07d0-bf5c-4368-a3fe-6d5136c2cd22\") " 
pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-9v9hx" Apr 16 16:48:33.624932 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.624916 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4b32d91f-2c9d-4d71-b910-066e212015e3-serving-cert\") pod \"kube-storage-version-migrator-operator-756bb7d76f-7w64z\" (UID: \"4b32d91f-2c9d-4d71-b910-066e212015e3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-7w64z" Apr 16 16:48:33.625024 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.624948 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8zglx\" (UniqueName: \"kubernetes.io/projected/2b75e55f-5bdd-4cbb-abd0-69be2a62852e-kube-api-access-8zglx\") pod \"service-ca-operator-69965bb79d-7wg74\" (UID: \"2b75e55f-5bdd-4cbb-abd0-69be2a62852e\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-7wg74" Apr 16 16:48:33.625024 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.624980 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/ebea9173-b1aa-4ef6-a1bb-6b7483b70fd2-snapshots\") pod \"insights-operator-5785d4fcdd-x9csm\" (UID: \"ebea9173-b1aa-4ef6-a1bb-6b7483b70fd2\") " pod="openshift-insights/insights-operator-5785d4fcdd-x9csm" Apr 16 16:48:33.625024 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.625009 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ebea9173-b1aa-4ef6-a1bb-6b7483b70fd2-serving-cert\") pod \"insights-operator-5785d4fcdd-x9csm\" (UID: \"ebea9173-b1aa-4ef6-a1bb-6b7483b70fd2\") " pod="openshift-insights/insights-operator-5785d4fcdd-x9csm" Apr 16 16:48:33.625164 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:33.625047 2573 
secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 16:48:33.625164 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.625055 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/7bc7dff0-16af-4031-a829-4427a2699284-telemetry-config\") pod \"cluster-monitoring-operator-6667474d89-xn576\" (UID: \"7bc7dff0-16af-4031-a829-4427a2699284\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-xn576" Apr 16 16:48:33.625164 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.625085 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sf5jb\" (UniqueName: \"kubernetes.io/projected/ebea9173-b1aa-4ef6-a1bb-6b7483b70fd2-kube-api-access-sf5jb\") pod \"insights-operator-5785d4fcdd-x9csm\" (UID: \"ebea9173-b1aa-4ef6-a1bb-6b7483b70fd2\") " pod="openshift-insights/insights-operator-5785d4fcdd-x9csm" Apr 16 16:48:33.625164 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:33.625106 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/363c07d0-bf5c-4368-a3fe-6d5136c2cd22-networking-console-plugin-cert podName:363c07d0-bf5c-4368-a3fe-6d5136c2cd22 nodeName:}" failed. No retries permitted until 2026-04-16 16:48:34.125087836 +0000 UTC m=+39.265806560 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/363c07d0-bf5c-4368-a3fe-6d5136c2cd22-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-9v9hx" (UID: "363c07d0-bf5c-4368-a3fe-6d5136c2cd22") : secret "networking-console-plugin-cert" not found Apr 16 16:48:33.625372 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.625294 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b32d91f-2c9d-4d71-b910-066e212015e3-config\") pod \"kube-storage-version-migrator-operator-756bb7d76f-7w64z\" (UID: \"4b32d91f-2c9d-4d71-b910-066e212015e3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-7w64z" Apr 16 16:48:33.625372 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.625305 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2b75e55f-5bdd-4cbb-abd0-69be2a62852e-serving-cert\") pod \"service-ca-operator-69965bb79d-7wg74\" (UID: \"2b75e55f-5bdd-4cbb-abd0-69be2a62852e\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-7wg74" Apr 16 16:48:33.625372 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.625349 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/7bc7dff0-16af-4031-a829-4427a2699284-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-xn576\" (UID: \"7bc7dff0-16af-4031-a829-4427a2699284\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-xn576" Apr 16 16:48:33.625537 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.625402 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ebea9173-b1aa-4ef6-a1bb-6b7483b70fd2-tmp\") pod 
\"insights-operator-5785d4fcdd-x9csm\" (UID: \"ebea9173-b1aa-4ef6-a1bb-6b7483b70fd2\") " pod="openshift-insights/insights-operator-5785d4fcdd-x9csm" Apr 16 16:48:33.625537 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.625411 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/363c07d0-bf5c-4368-a3fe-6d5136c2cd22-nginx-conf\") pod \"networking-console-plugin-5cb6cf4cb4-9v9hx\" (UID: \"363c07d0-bf5c-4368-a3fe-6d5136c2cd22\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-9v9hx" Apr 16 16:48:33.625537 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.625437 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tszcd\" (UniqueName: \"kubernetes.io/projected/7bc7dff0-16af-4031-a829-4427a2699284-kube-api-access-tszcd\") pod \"cluster-monitoring-operator-6667474d89-xn576\" (UID: \"7bc7dff0-16af-4031-a829-4427a2699284\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-xn576" Apr 16 16:48:33.625537 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:33.625466 2573 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 16:48:33.625537 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.625469 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ebea9173-b1aa-4ef6-a1bb-6b7483b70fd2-trusted-ca-bundle\") pod \"insights-operator-5785d4fcdd-x9csm\" (UID: \"ebea9173-b1aa-4ef6-a1bb-6b7483b70fd2\") " pod="openshift-insights/insights-operator-5785d4fcdd-x9csm" Apr 16 16:48:33.625537 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:33.625524 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7bc7dff0-16af-4031-a829-4427a2699284-cluster-monitoring-operator-tls 
podName:7bc7dff0-16af-4031-a829-4427a2699284 nodeName:}" failed. No retries permitted until 2026-04-16 16:48:34.125510529 +0000 UTC m=+39.266229256 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/7bc7dff0-16af-4031-a829-4427a2699284-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-xn576" (UID: "7bc7dff0-16af-4031-a829-4427a2699284") : secret "cluster-monitoring-operator-tls" not found Apr 16 16:48:33.625791 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.625560 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fn84w\" (UniqueName: \"kubernetes.io/projected/533755f5-7620-43ef-aa9a-be97a74e8866-kube-api-access-fn84w\") pod \"network-check-source-7b678d77c7-2sdvm\" (UID: \"533755f5-7620-43ef-aa9a-be97a74e8866\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-2sdvm" Apr 16 16:48:33.625791 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.625593 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b75e55f-5bdd-4cbb-abd0-69be2a62852e-config\") pod \"service-ca-operator-69965bb79d-7wg74\" (UID: \"2b75e55f-5bdd-4cbb-abd0-69be2a62852e\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-7wg74" Apr 16 16:48:33.625791 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.625636 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ebea9173-b1aa-4ef6-a1bb-6b7483b70fd2-service-ca-bundle\") pod \"insights-operator-5785d4fcdd-x9csm\" (UID: \"ebea9173-b1aa-4ef6-a1bb-6b7483b70fd2\") " pod="openshift-insights/insights-operator-5785d4fcdd-x9csm" Apr 16 16:48:33.625947 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.625868 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/7bc7dff0-16af-4031-a829-4427a2699284-telemetry-config\") pod \"cluster-monitoring-operator-6667474d89-xn576\" (UID: \"7bc7dff0-16af-4031-a829-4427a2699284\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-xn576" Apr 16 16:48:33.626131 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.626106 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b75e55f-5bdd-4cbb-abd0-69be2a62852e-config\") pod \"service-ca-operator-69965bb79d-7wg74\" (UID: \"2b75e55f-5bdd-4cbb-abd0-69be2a62852e\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-7wg74" Apr 16 16:48:33.627478 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.627455 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4b32d91f-2c9d-4d71-b910-066e212015e3-serving-cert\") pod \"kube-storage-version-migrator-operator-756bb7d76f-7w64z\" (UID: \"4b32d91f-2c9d-4d71-b910-066e212015e3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-7w64z" Apr 16 16:48:33.627736 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.627717 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2b75e55f-5bdd-4cbb-abd0-69be2a62852e-serving-cert\") pod \"service-ca-operator-69965bb79d-7wg74\" (UID: \"2b75e55f-5bdd-4cbb-abd0-69be2a62852e\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-7wg74" Apr 16 16:48:33.633511 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.633487 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tszcd\" (UniqueName: \"kubernetes.io/projected/7bc7dff0-16af-4031-a829-4427a2699284-kube-api-access-tszcd\") pod \"cluster-monitoring-operator-6667474d89-xn576\" (UID: 
\"7bc7dff0-16af-4031-a829-4427a2699284\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-xn576" Apr 16 16:48:33.633980 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.633956 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fn84w\" (UniqueName: \"kubernetes.io/projected/533755f5-7620-43ef-aa9a-be97a74e8866-kube-api-access-fn84w\") pod \"network-check-source-7b678d77c7-2sdvm\" (UID: \"533755f5-7620-43ef-aa9a-be97a74e8866\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-2sdvm" Apr 16 16:48:33.634440 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.634361 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnxr9\" (UniqueName: \"kubernetes.io/projected/4b32d91f-2c9d-4d71-b910-066e212015e3-kube-api-access-pnxr9\") pod \"kube-storage-version-migrator-operator-756bb7d76f-7w64z\" (UID: \"4b32d91f-2c9d-4d71-b910-066e212015e3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-7w64z" Apr 16 16:48:33.634519 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.634480 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zglx\" (UniqueName: \"kubernetes.io/projected/2b75e55f-5bdd-4cbb-abd0-69be2a62852e-kube-api-access-8zglx\") pod \"service-ca-operator-69965bb79d-7wg74\" (UID: \"2b75e55f-5bdd-4cbb-abd0-69be2a62852e\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-7wg74" Apr 16 16:48:33.661921 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.661903 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-2sdvm" Apr 16 16:48:33.682450 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.682429 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-7w64z" Apr 16 16:48:33.698143 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.698116 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-7wg74" Apr 16 16:48:33.726010 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.725990 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ebea9173-b1aa-4ef6-a1bb-6b7483b70fd2-tmp\") pod \"insights-operator-5785d4fcdd-x9csm\" (UID: \"ebea9173-b1aa-4ef6-a1bb-6b7483b70fd2\") " pod="openshift-insights/insights-operator-5785d4fcdd-x9csm" Apr 16 16:48:33.726102 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.726027 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ebea9173-b1aa-4ef6-a1bb-6b7483b70fd2-trusted-ca-bundle\") pod \"insights-operator-5785d4fcdd-x9csm\" (UID: \"ebea9173-b1aa-4ef6-a1bb-6b7483b70fd2\") " pod="openshift-insights/insights-operator-5785d4fcdd-x9csm" Apr 16 16:48:33.726102 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.726072 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ebea9173-b1aa-4ef6-a1bb-6b7483b70fd2-service-ca-bundle\") pod \"insights-operator-5785d4fcdd-x9csm\" (UID: \"ebea9173-b1aa-4ef6-a1bb-6b7483b70fd2\") " pod="openshift-insights/insights-operator-5785d4fcdd-x9csm" Apr 16 16:48:33.726732 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.726706 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ebea9173-b1aa-4ef6-a1bb-6b7483b70fd2-tmp\") pod \"insights-operator-5785d4fcdd-x9csm\" (UID: \"ebea9173-b1aa-4ef6-a1bb-6b7483b70fd2\") " 
pod="openshift-insights/insights-operator-5785d4fcdd-x9csm" Apr 16 16:48:33.726997 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.726976 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ebea9173-b1aa-4ef6-a1bb-6b7483b70fd2-service-ca-bundle\") pod \"insights-operator-5785d4fcdd-x9csm\" (UID: \"ebea9173-b1aa-4ef6-a1bb-6b7483b70fd2\") " pod="openshift-insights/insights-operator-5785d4fcdd-x9csm" Apr 16 16:48:33.727158 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.727126 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/ebea9173-b1aa-4ef6-a1bb-6b7483b70fd2-snapshots\") pod \"insights-operator-5785d4fcdd-x9csm\" (UID: \"ebea9173-b1aa-4ef6-a1bb-6b7483b70fd2\") " pod="openshift-insights/insights-operator-5785d4fcdd-x9csm" Apr 16 16:48:33.727269 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.727182 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ebea9173-b1aa-4ef6-a1bb-6b7483b70fd2-serving-cert\") pod \"insights-operator-5785d4fcdd-x9csm\" (UID: \"ebea9173-b1aa-4ef6-a1bb-6b7483b70fd2\") " pod="openshift-insights/insights-operator-5785d4fcdd-x9csm" Apr 16 16:48:33.727269 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.727248 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sf5jb\" (UniqueName: \"kubernetes.io/projected/ebea9173-b1aa-4ef6-a1bb-6b7483b70fd2-kube-api-access-sf5jb\") pod \"insights-operator-5785d4fcdd-x9csm\" (UID: \"ebea9173-b1aa-4ef6-a1bb-6b7483b70fd2\") " pod="openshift-insights/insights-operator-5785d4fcdd-x9csm" Apr 16 16:48:33.727941 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.727919 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: 
\"kubernetes.io/empty-dir/ebea9173-b1aa-4ef6-a1bb-6b7483b70fd2-snapshots\") pod \"insights-operator-5785d4fcdd-x9csm\" (UID: \"ebea9173-b1aa-4ef6-a1bb-6b7483b70fd2\") " pod="openshift-insights/insights-operator-5785d4fcdd-x9csm" Apr 16 16:48:33.730669 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.730106 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ebea9173-b1aa-4ef6-a1bb-6b7483b70fd2-serving-cert\") pod \"insights-operator-5785d4fcdd-x9csm\" (UID: \"ebea9173-b1aa-4ef6-a1bb-6b7483b70fd2\") " pod="openshift-insights/insights-operator-5785d4fcdd-x9csm" Apr 16 16:48:33.736083 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.736063 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sf5jb\" (UniqueName: \"kubernetes.io/projected/ebea9173-b1aa-4ef6-a1bb-6b7483b70fd2-kube-api-access-sf5jb\") pod \"insights-operator-5785d4fcdd-x9csm\" (UID: \"ebea9173-b1aa-4ef6-a1bb-6b7483b70fd2\") " pod="openshift-insights/insights-operator-5785d4fcdd-x9csm" Apr 16 16:48:33.737283 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.737253 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ebea9173-b1aa-4ef6-a1bb-6b7483b70fd2-trusted-ca-bundle\") pod \"insights-operator-5785d4fcdd-x9csm\" (UID: \"ebea9173-b1aa-4ef6-a1bb-6b7483b70fd2\") " pod="openshift-insights/insights-operator-5785d4fcdd-x9csm" Apr 16 16:48:33.761996 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:33.761971 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-5785d4fcdd-x9csm" Apr 16 16:48:34.029763 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:34.029728 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/8af394bc-025e-4545-801f-0e6309febaa3-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-xr578\" (UID: \"8af394bc-025e-4545-801f-0e6309febaa3\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-xr578" Apr 16 16:48:34.030004 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:34.029784 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/682aafdb-c596-4a7b-8112-c6c867ff770e-metrics-certs\") pod \"router-default-5887855b54-djpw5\" (UID: \"682aafdb-c596-4a7b-8112-c6c867ff770e\") " pod="openshift-ingress/router-default-5887855b54-djpw5" Apr 16 16:48:34.030004 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:34.029821 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/682aafdb-c596-4a7b-8112-c6c867ff770e-service-ca-bundle\") pod \"router-default-5887855b54-djpw5\" (UID: \"682aafdb-c596-4a7b-8112-c6c867ff770e\") " pod="openshift-ingress/router-default-5887855b54-djpw5" Apr 16 16:48:34.030004 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:34.029876 2573 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 16:48:34.030004 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:34.029924 2573 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 16:48:34.030004 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:34.029940 2573 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/682aafdb-c596-4a7b-8112-c6c867ff770e-service-ca-bundle podName:682aafdb-c596-4a7b-8112-c6c867ff770e nodeName:}" failed. No retries permitted until 2026-04-16 16:48:35.029922887 +0000 UTC m=+40.170641609 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/682aafdb-c596-4a7b-8112-c6c867ff770e-service-ca-bundle") pod "router-default-5887855b54-djpw5" (UID: "682aafdb-c596-4a7b-8112-c6c867ff770e") : configmap references non-existent config key: service-ca.crt Apr 16 16:48:34.030004 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:34.029959 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8af394bc-025e-4545-801f-0e6309febaa3-samples-operator-tls podName:8af394bc-025e-4545-801f-0e6309febaa3 nodeName:}" failed. No retries permitted until 2026-04-16 16:48:35.029950318 +0000 UTC m=+40.170669041 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/8af394bc-025e-4545-801f-0e6309febaa3-samples-operator-tls") pod "cluster-samples-operator-667775844f-xr578" (UID: "8af394bc-025e-4545-801f-0e6309febaa3") : secret "samples-operator-tls" not found Apr 16 16:48:34.030004 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:34.029985 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/682aafdb-c596-4a7b-8112-c6c867ff770e-metrics-certs podName:682aafdb-c596-4a7b-8112-c6c867ff770e nodeName:}" failed. No retries permitted until 2026-04-16 16:48:35.029967137 +0000 UTC m=+40.170685857 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/682aafdb-c596-4a7b-8112-c6c867ff770e-metrics-certs") pod "router-default-5887855b54-djpw5" (UID: "682aafdb-c596-4a7b-8112-c6c867ff770e") : secret "router-metrics-certs-default" not found Apr 16 16:48:34.131593 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:34.131541 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/363c07d0-bf5c-4368-a3fe-6d5136c2cd22-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-9v9hx\" (UID: \"363c07d0-bf5c-4368-a3fe-6d5136c2cd22\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-9v9hx" Apr 16 16:48:34.131761 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:34.131657 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/7bc7dff0-16af-4031-a829-4427a2699284-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-xn576\" (UID: \"7bc7dff0-16af-4031-a829-4427a2699284\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-xn576" Apr 16 16:48:34.131761 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:34.131714 2573 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 16:48:34.131875 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:34.131788 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/363c07d0-bf5c-4368-a3fe-6d5136c2cd22-networking-console-plugin-cert podName:363c07d0-bf5c-4368-a3fe-6d5136c2cd22 nodeName:}" failed. No retries permitted until 2026-04-16 16:48:35.131771014 +0000 UTC m=+40.272489738 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/363c07d0-bf5c-4368-a3fe-6d5136c2cd22-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-9v9hx" (UID: "363c07d0-bf5c-4368-a3fe-6d5136c2cd22") : secret "networking-console-plugin-cert" not found Apr 16 16:48:34.131875 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:34.131806 2573 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 16:48:34.131875 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:34.131864 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7bc7dff0-16af-4031-a829-4427a2699284-cluster-monitoring-operator-tls podName:7bc7dff0-16af-4031-a829-4427a2699284 nodeName:}" failed. No retries permitted until 2026-04-16 16:48:35.131847036 +0000 UTC m=+40.272565769 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/7bc7dff0-16af-4031-a829-4427a2699284-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-xn576" (UID: "7bc7dff0-16af-4031-a829-4427a2699284") : secret "cluster-monitoring-operator-tls" not found Apr 16 16:48:35.039316 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:35.039281 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/8af394bc-025e-4545-801f-0e6309febaa3-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-xr578\" (UID: \"8af394bc-025e-4545-801f-0e6309febaa3\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-xr578" Apr 16 16:48:35.039739 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:35.039328 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/682aafdb-c596-4a7b-8112-c6c867ff770e-metrics-certs\") pod \"router-default-5887855b54-djpw5\" (UID: \"682aafdb-c596-4a7b-8112-c6c867ff770e\") " pod="openshift-ingress/router-default-5887855b54-djpw5" Apr 16 16:48:35.039739 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:35.039456 2573 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 16:48:35.039739 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:35.039480 2573 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 16:48:35.039739 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:35.039534 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8af394bc-025e-4545-801f-0e6309febaa3-samples-operator-tls podName:8af394bc-025e-4545-801f-0e6309febaa3 nodeName:}" failed. No retries permitted until 2026-04-16 16:48:37.039515079 +0000 UTC m=+42.180233799 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/8af394bc-025e-4545-801f-0e6309febaa3-samples-operator-tls") pod "cluster-samples-operator-667775844f-xr578" (UID: "8af394bc-025e-4545-801f-0e6309febaa3") : secret "samples-operator-tls" not found Apr 16 16:48:35.039739 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:35.039578 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/682aafdb-c596-4a7b-8112-c6c867ff770e-service-ca-bundle\") pod \"router-default-5887855b54-djpw5\" (UID: \"682aafdb-c596-4a7b-8112-c6c867ff770e\") " pod="openshift-ingress/router-default-5887855b54-djpw5" Apr 16 16:48:35.039739 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:35.039602 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/682aafdb-c596-4a7b-8112-c6c867ff770e-metrics-certs podName:682aafdb-c596-4a7b-8112-c6c867ff770e nodeName:}" failed. No retries permitted until 2026-04-16 16:48:37.039577546 +0000 UTC m=+42.180296281 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/682aafdb-c596-4a7b-8112-c6c867ff770e-metrics-certs") pod "router-default-5887855b54-djpw5" (UID: "682aafdb-c596-4a7b-8112-c6c867ff770e") : secret "router-metrics-certs-default" not found Apr 16 16:48:35.039739 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:35.039650 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/682aafdb-c596-4a7b-8112-c6c867ff770e-service-ca-bundle podName:682aafdb-c596-4a7b-8112-c6c867ff770e nodeName:}" failed. No retries permitted until 2026-04-16 16:48:37.039639033 +0000 UTC m=+42.180357754 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/682aafdb-c596-4a7b-8112-c6c867ff770e-service-ca-bundle") pod "router-default-5887855b54-djpw5" (UID: "682aafdb-c596-4a7b-8112-c6c867ff770e") : configmap references non-existent config key: service-ca.crt Apr 16 16:48:35.140043 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:35.140012 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/7bc7dff0-16af-4031-a829-4427a2699284-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-xn576\" (UID: \"7bc7dff0-16af-4031-a829-4427a2699284\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-xn576" Apr 16 16:48:35.140193 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:35.140147 2573 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 16:48:35.140193 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:35.140166 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/363c07d0-bf5c-4368-a3fe-6d5136c2cd22-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-9v9hx\" (UID: \"363c07d0-bf5c-4368-a3fe-6d5136c2cd22\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-9v9hx" Apr 16 16:48:35.140288 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:35.140203 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7bc7dff0-16af-4031-a829-4427a2699284-cluster-monitoring-operator-tls podName:7bc7dff0-16af-4031-a829-4427a2699284 nodeName:}" failed. No retries permitted until 2026-04-16 16:48:37.140183582 +0000 UTC m=+42.280902310 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/7bc7dff0-16af-4031-a829-4427a2699284-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-xn576" (UID: "7bc7dff0-16af-4031-a829-4427a2699284") : secret "cluster-monitoring-operator-tls" not found Apr 16 16:48:35.140288 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:35.140257 2573 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 16:48:35.140364 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:35.140310 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/363c07d0-bf5c-4368-a3fe-6d5136c2cd22-networking-console-plugin-cert podName:363c07d0-bf5c-4368-a3fe-6d5136c2cd22 nodeName:}" failed. No retries permitted until 2026-04-16 16:48:37.140294196 +0000 UTC m=+42.281012923 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/363c07d0-bf5c-4368-a3fe-6d5136c2cd22-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-9v9hx" (UID: "363c07d0-bf5c-4368-a3fe-6d5136c2cd22") : secret "networking-console-plugin-cert" not found Apr 16 16:48:35.644371 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:35.644341 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/59cc8b51-5a0b-45ea-8e53-f1473e78b939-original-pull-secret\") pod \"global-pull-secret-syncer-4zxj2\" (UID: \"59cc8b51-5a0b-45ea-8e53-f1473e78b939\") " pod="kube-system/global-pull-secret-syncer-4zxj2" Apr 16 16:48:35.646478 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:35.646451 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/59cc8b51-5a0b-45ea-8e53-f1473e78b939-original-pull-secret\") 
pod \"global-pull-secret-syncer-4zxj2\" (UID: \"59cc8b51-5a0b-45ea-8e53-f1473e78b939\") " pod="kube-system/global-pull-secret-syncer-4zxj2" Apr 16 16:48:35.941796 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:35.941702 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-4zxj2" Apr 16 16:48:36.048161 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:36.048122 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d030f4e9-69ab-40db-8fd6-d2f53d467cdb-registry-tls\") pod \"image-registry-558df76499-sw5fx\" (UID: \"d030f4e9-69ab-40db-8fd6-d2f53d467cdb\") " pod="openshift-image-registry/image-registry-558df76499-sw5fx" Apr 16 16:48:36.048560 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:36.048249 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 16:48:36.048560 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:36.048265 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-558df76499-sw5fx: secret "image-registry-tls" not found Apr 16 16:48:36.048560 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:36.048321 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d030f4e9-69ab-40db-8fd6-d2f53d467cdb-registry-tls podName:d030f4e9-69ab-40db-8fd6-d2f53d467cdb nodeName:}" failed. No retries permitted until 2026-04-16 16:48:44.048306139 +0000 UTC m=+49.189024859 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/d030f4e9-69ab-40db-8fd6-d2f53d467cdb-registry-tls") pod "image-registry-558df76499-sw5fx" (UID: "d030f4e9-69ab-40db-8fd6-d2f53d467cdb") : secret "image-registry-tls" not found Apr 16 16:48:36.148650 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:36.148616 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/db86a360-38b7-4c87-ac77-176127220106-metrics-tls\") pod \"dns-default-lhgcr\" (UID: \"db86a360-38b7-4c87-ac77-176127220106\") " pod="openshift-dns/dns-default-lhgcr" Apr 16 16:48:36.148850 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:36.148772 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/40e02308-a3e4-43c3-8e6d-b59cfe039143-cert\") pod \"ingress-canary-zt96l\" (UID: \"40e02308-a3e4-43c3-8e6d-b59cfe039143\") " pod="openshift-ingress-canary/ingress-canary-zt96l" Apr 16 16:48:36.148850 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:36.148774 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 16:48:36.148850 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:36.148827 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 16:48:36.148964 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:36.148875 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/db86a360-38b7-4c87-ac77-176127220106-metrics-tls podName:db86a360-38b7-4c87-ac77-176127220106 nodeName:}" failed. No retries permitted until 2026-04-16 16:48:44.14885947 +0000 UTC m=+49.289578190 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/db86a360-38b7-4c87-ac77-176127220106-metrics-tls") pod "dns-default-lhgcr" (UID: "db86a360-38b7-4c87-ac77-176127220106") : secret "dns-default-metrics-tls" not found Apr 16 16:48:36.148964 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:36.148892 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/40e02308-a3e4-43c3-8e6d-b59cfe039143-cert podName:40e02308-a3e4-43c3-8e6d-b59cfe039143 nodeName:}" failed. No retries permitted until 2026-04-16 16:48:44.148883833 +0000 UTC m=+49.289602554 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/40e02308-a3e4-43c3-8e6d-b59cfe039143-cert") pod "ingress-canary-zt96l" (UID: "40e02308-a3e4-43c3-8e6d-b59cfe039143") : secret "canary-serving-cert" not found Apr 16 16:48:37.056713 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:37.056677 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/8af394bc-025e-4545-801f-0e6309febaa3-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-xr578\" (UID: \"8af394bc-025e-4545-801f-0e6309febaa3\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-xr578" Apr 16 16:48:37.056713 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:37.056716 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/682aafdb-c596-4a7b-8112-c6c867ff770e-metrics-certs\") pod \"router-default-5887855b54-djpw5\" (UID: \"682aafdb-c596-4a7b-8112-c6c867ff770e\") " pod="openshift-ingress/router-default-5887855b54-djpw5" Apr 16 16:48:37.057151 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:37.056745 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/682aafdb-c596-4a7b-8112-c6c867ff770e-service-ca-bundle\") pod \"router-default-5887855b54-djpw5\" (UID: \"682aafdb-c596-4a7b-8112-c6c867ff770e\") " pod="openshift-ingress/router-default-5887855b54-djpw5" Apr 16 16:48:37.057151 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:37.056826 2573 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 16:48:37.057151 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:37.056826 2573 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 16:48:37.057151 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:37.056834 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/682aafdb-c596-4a7b-8112-c6c867ff770e-service-ca-bundle podName:682aafdb-c596-4a7b-8112-c6c867ff770e nodeName:}" failed. No retries permitted until 2026-04-16 16:48:41.056821937 +0000 UTC m=+46.197540656 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/682aafdb-c596-4a7b-8112-c6c867ff770e-service-ca-bundle") pod "router-default-5887855b54-djpw5" (UID: "682aafdb-c596-4a7b-8112-c6c867ff770e") : configmap references non-existent config key: service-ca.crt Apr 16 16:48:37.057151 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:37.056967 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/682aafdb-c596-4a7b-8112-c6c867ff770e-metrics-certs podName:682aafdb-c596-4a7b-8112-c6c867ff770e nodeName:}" failed. No retries permitted until 2026-04-16 16:48:41.056956145 +0000 UTC m=+46.197674865 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/682aafdb-c596-4a7b-8112-c6c867ff770e-metrics-certs") pod "router-default-5887855b54-djpw5" (UID: "682aafdb-c596-4a7b-8112-c6c867ff770e") : secret "router-metrics-certs-default" not found Apr 16 16:48:37.057151 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:37.056978 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8af394bc-025e-4545-801f-0e6309febaa3-samples-operator-tls podName:8af394bc-025e-4545-801f-0e6309febaa3 nodeName:}" failed. No retries permitted until 2026-04-16 16:48:41.056972459 +0000 UTC m=+46.197691178 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/8af394bc-025e-4545-801f-0e6309febaa3-samples-operator-tls") pod "cluster-samples-operator-667775844f-xr578" (UID: "8af394bc-025e-4545-801f-0e6309febaa3") : secret "samples-operator-tls" not found Apr 16 16:48:37.157822 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:37.157796 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/363c07d0-bf5c-4368-a3fe-6d5136c2cd22-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-9v9hx\" (UID: \"363c07d0-bf5c-4368-a3fe-6d5136c2cd22\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-9v9hx" Apr 16 16:48:37.157955 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:37.157854 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/7bc7dff0-16af-4031-a829-4427a2699284-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-xn576\" (UID: \"7bc7dff0-16af-4031-a829-4427a2699284\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-xn576" Apr 16 16:48:37.157955 ip-10-0-138-58 kubenswrapper[2573]: 
E0416 16:48:37.157936 2573 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 16:48:37.158075 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:37.157951 2573 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 16:48:37.158075 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:37.157971 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7bc7dff0-16af-4031-a829-4427a2699284-cluster-monitoring-operator-tls podName:7bc7dff0-16af-4031-a829-4427a2699284 nodeName:}" failed. No retries permitted until 2026-04-16 16:48:41.157960269 +0000 UTC m=+46.298678988 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/7bc7dff0-16af-4031-a829-4427a2699284-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-xn576" (UID: "7bc7dff0-16af-4031-a829-4427a2699284") : secret "cluster-monitoring-operator-tls" not found Apr 16 16:48:37.158075 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:37.158013 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/363c07d0-bf5c-4368-a3fe-6d5136c2cd22-networking-console-plugin-cert podName:363c07d0-bf5c-4368-a3fe-6d5136c2cd22 nodeName:}" failed. No retries permitted until 2026-04-16 16:48:41.157995031 +0000 UTC m=+46.298713760 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/363c07d0-bf5c-4368-a3fe-6d5136c2cd22-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-9v9hx" (UID: "363c07d0-bf5c-4368-a3fe-6d5136c2cd22") : secret "networking-console-plugin-cert" not found Apr 16 16:48:37.667503 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:37.667180 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-2sdvm"] Apr 16 16:48:37.679440 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:37.679410 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-7w64z"] Apr 16 16:48:37.697960 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:37.697701 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8547679d7d-25lps" event={"ID":"5ca9f9f4-bcc1-439f-8f7f-3c133fadff5f","Type":"ContainerStarted","Data":"973b31421901e868405166b14e0a260bfd82ddc97fbdc1ff66320ca0db7088e7"} Apr 16 16:48:37.699786 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:37.699726 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8596fb7f8-nnmch" event={"ID":"536018da-d22a-4d5a-a54d-4d2116c68151","Type":"ContainerStarted","Data":"8685ce792fd7420cb3e5e14741159fac1c136ebb3be88162b143b1b93d1963ef"} Apr 16 16:48:37.711562 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:37.707873 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8547679d7d-25lps" Apr 16 16:48:37.711562 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:37.708201 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-md7k7" Apr 
16 16:48:37.711936 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:37.711918 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8547679d7d-25lps" Apr 16 16:48:37.729628 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:37.728704 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8547679d7d-25lps" podStartSLOduration=3.791341145 podStartE2EDuration="11.728687308s" podCreationTimestamp="2026-04-16 16:48:26 +0000 UTC" firstStartedPulling="2026-04-16 16:48:29.515805858 +0000 UTC m=+34.656524578" lastFinishedPulling="2026-04-16 16:48:37.453152017 +0000 UTC m=+42.593870741" observedRunningTime="2026-04-16 16:48:37.726201985 +0000 UTC m=+42.866920727" watchObservedRunningTime="2026-04-16 16:48:37.728687308 +0000 UTC m=+42.869406050" Apr 16 16:48:37.740917 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:37.740871 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-md7k7" podStartSLOduration=34.973087375 podStartE2EDuration="42.740857203s" podCreationTimestamp="2026-04-16 16:47:55 +0000 UTC" firstStartedPulling="2026-04-16 16:48:29.670261632 +0000 UTC m=+34.810980351" lastFinishedPulling="2026-04-16 16:48:37.438031447 +0000 UTC m=+42.578750179" observedRunningTime="2026-04-16 16:48:37.740406262 +0000 UTC m=+42.881125005" watchObservedRunningTime="2026-04-16 16:48:37.740857203 +0000 UTC m=+42.881575948" Apr 16 16:48:37.869967 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:37.869897 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-d87b8d5fc-jgsfb"] Apr 16 16:48:37.872533 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:48:37.872509 2573 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6c99cce0_b27a_481f_8825_9d205581b7d0.slice/crio-b697e59920363055a42a09978a477e43a18b130bf9ecaf5d7d97b6b2b5cce9b3 WatchSource:0}: Error finding container b697e59920363055a42a09978a477e43a18b130bf9ecaf5d7d97b6b2b5cce9b3: Status 404 returned error can't find the container with id b697e59920363055a42a09978a477e43a18b130bf9ecaf5d7d97b6b2b5cce9b3 Apr 16 16:48:37.924193 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:37.923939 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69965bb79d-7wg74"] Apr 16 16:48:37.926340 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:37.926317 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-4zxj2"] Apr 16 16:48:37.928742 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:48:37.928714 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b75e55f_5bdd_4cbb_abd0_69be2a62852e.slice/crio-75e8b013502b0fac72353e08f198811cfd138a87d76848066b77a5d1bda21482 WatchSource:0}: Error finding container 75e8b013502b0fac72353e08f198811cfd138a87d76848066b77a5d1bda21482: Status 404 returned error can't find the container with id 75e8b013502b0fac72353e08f198811cfd138a87d76848066b77a5d1bda21482 Apr 16 16:48:37.929062 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:37.928949 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-2gpsx"] Apr 16 16:48:37.929273 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:48:37.929251 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod59cc8b51_5a0b_45ea_8e53_f1473e78b939.slice/crio-c19424c7361c4123a35d1956a24fe43060c377da233e4f2bf4b1292eb02a2690 WatchSource:0}: Error finding container 
c19424c7361c4123a35d1956a24fe43060c377da233e4f2bf4b1292eb02a2690: Status 404 returned error can't find the container with id c19424c7361c4123a35d1956a24fe43060c377da233e4f2bf4b1292eb02a2690 Apr 16 16:48:37.931362 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:37.931339 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-5785d4fcdd-x9csm"] Apr 16 16:48:37.932066 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:48:37.931929 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96edca93_47fe_432e_86c9_b734c62b1424.slice/crio-310929cd4c18272dd8f4ba5f1acd8f4c812c5a180a94e75f08721ef446e0701b WatchSource:0}: Error finding container 310929cd4c18272dd8f4ba5f1acd8f4c812c5a180a94e75f08721ef446e0701b: Status 404 returned error can't find the container with id 310929cd4c18272dd8f4ba5f1acd8f4c812c5a180a94e75f08721ef446e0701b Apr 16 16:48:37.934411 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:48:37.934334 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podebea9173_b1aa_4ef6_a1bb_6b7483b70fd2.slice/crio-9ce5e028f1b43c375954d7f4806dbbf2517a9d75f4a64049eeb95859cde41502 WatchSource:0}: Error finding container 9ce5e028f1b43c375954d7f4806dbbf2517a9d75f4a64049eeb95859cde41502: Status 404 returned error can't find the container with id 9ce5e028f1b43c375954d7f4806dbbf2517a9d75f4a64049eeb95859cde41502 Apr 16 16:48:38.714985 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:38.714216 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-cdff685c-x7jtx" event={"ID":"36ad6397-4200-4581-8948-4e5ecae47a04","Type":"ContainerStarted","Data":"5b023cbc1a70d6642a1f79bf12b14731e8bc613eb30dc8e0a3c4f09a22b3eec8"} Apr 16 16:48:38.718019 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:38.717979 2573 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-2sdvm" event={"ID":"533755f5-7620-43ef-aa9a-be97a74e8866","Type":"ContainerStarted","Data":"8dabf52a663415969ddffcd1c79d2f46d7f63d00ed080c12b8df5192534e9c99"} Apr 16 16:48:38.718019 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:38.718013 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-2sdvm" event={"ID":"533755f5-7620-43ef-aa9a-be97a74e8866","Type":"ContainerStarted","Data":"d0a4f2da388e749edbeaf819e674b98fe1c43cacae8e3f7b9481fa87408c53b9"} Apr 16 16:48:38.720667 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:38.720621 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-jgsfb" event={"ID":"6c99cce0-b27a-481f-8825-9d205581b7d0","Type":"ContainerStarted","Data":"b697e59920363055a42a09978a477e43a18b130bf9ecaf5d7d97b6b2b5cce9b3"} Apr 16 16:48:38.742011 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:38.740873 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-cdff685c-x7jtx" podStartSLOduration=4.866168214 podStartE2EDuration="12.740858278s" podCreationTimestamp="2026-04-16 16:48:26 +0000 UTC" firstStartedPulling="2026-04-16 16:48:29.515957644 +0000 UTC m=+34.656676372" lastFinishedPulling="2026-04-16 16:48:37.3906477 +0000 UTC m=+42.531366436" observedRunningTime="2026-04-16 16:48:38.740302622 +0000 UTC m=+43.881021364" watchObservedRunningTime="2026-04-16 16:48:38.740858278 +0000 UTC m=+43.881577020" Apr 16 16:48:38.742342 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:38.742298 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-4zxj2" event={"ID":"59cc8b51-5a0b-45ea-8e53-f1473e78b939","Type":"ContainerStarted","Data":"c19424c7361c4123a35d1956a24fe43060c377da233e4f2bf4b1292eb02a2690"} Apr 16 
16:48:38.745712 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:38.745659 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-x9csm" event={"ID":"ebea9173-b1aa-4ef6-a1bb-6b7483b70fd2","Type":"ContainerStarted","Data":"9ce5e028f1b43c375954d7f4806dbbf2517a9d75f4a64049eeb95859cde41502"} Apr 16 16:48:38.747669 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:38.747623 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-2gpsx" event={"ID":"96edca93-47fe-432e-86c9-b734c62b1424","Type":"ContainerStarted","Data":"310929cd4c18272dd8f4ba5f1acd8f4c812c5a180a94e75f08721ef446e0701b"} Apr 16 16:48:38.749972 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:38.749930 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-7w64z" event={"ID":"4b32d91f-2c9d-4d71-b910-066e212015e3","Type":"ContainerStarted","Data":"f3a33303f15730cac54f44961abc2ecddb11e54c3fa5e56879ab5c971f33fae4"} Apr 16 16:48:38.754242 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:38.753750 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-md7k7" event={"ID":"41dd0be2-82c4-4469-b8d9-d1b98a4adb55","Type":"ContainerStarted","Data":"8b66261f074b436f639994b1f4baed06906ca095b7a113b6967e3565771ad76f"} Apr 16 16:48:38.758991 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:38.758945 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-7wg74" event={"ID":"2b75e55f-5bdd-4cbb-abd0-69be2a62852e","Type":"ContainerStarted","Data":"75e8b013502b0fac72353e08f198811cfd138a87d76848066b77a5d1bda21482"} Apr 16 16:48:41.099531 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:41.099499 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/8af394bc-025e-4545-801f-0e6309febaa3-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-xr578\" (UID: \"8af394bc-025e-4545-801f-0e6309febaa3\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-xr578" Apr 16 16:48:41.099946 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:41.099542 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/682aafdb-c596-4a7b-8112-c6c867ff770e-metrics-certs\") pod \"router-default-5887855b54-djpw5\" (UID: \"682aafdb-c596-4a7b-8112-c6c867ff770e\") " pod="openshift-ingress/router-default-5887855b54-djpw5" Apr 16 16:48:41.099946 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:41.099594 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/682aafdb-c596-4a7b-8112-c6c867ff770e-service-ca-bundle\") pod \"router-default-5887855b54-djpw5\" (UID: \"682aafdb-c596-4a7b-8112-c6c867ff770e\") " pod="openshift-ingress/router-default-5887855b54-djpw5" Apr 16 16:48:41.099946 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:41.099673 2573 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 16:48:41.099946 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:41.099704 2573 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 16:48:41.099946 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:41.099746 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8af394bc-025e-4545-801f-0e6309febaa3-samples-operator-tls podName:8af394bc-025e-4545-801f-0e6309febaa3 nodeName:}" failed. No retries permitted until 2026-04-16 16:48:49.099727397 +0000 UTC m=+54.240446125 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/8af394bc-025e-4545-801f-0e6309febaa3-samples-operator-tls") pod "cluster-samples-operator-667775844f-xr578" (UID: "8af394bc-025e-4545-801f-0e6309febaa3") : secret "samples-operator-tls" not found Apr 16 16:48:41.099946 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:41.099761 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/682aafdb-c596-4a7b-8112-c6c867ff770e-metrics-certs podName:682aafdb-c596-4a7b-8112-c6c867ff770e nodeName:}" failed. No retries permitted until 2026-04-16 16:48:49.099755014 +0000 UTC m=+54.240473733 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/682aafdb-c596-4a7b-8112-c6c867ff770e-metrics-certs") pod "router-default-5887855b54-djpw5" (UID: "682aafdb-c596-4a7b-8112-c6c867ff770e") : secret "router-metrics-certs-default" not found Apr 16 16:48:41.099946 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:41.099798 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/682aafdb-c596-4a7b-8112-c6c867ff770e-service-ca-bundle podName:682aafdb-c596-4a7b-8112-c6c867ff770e nodeName:}" failed. No retries permitted until 2026-04-16 16:48:49.09978166 +0000 UTC m=+54.240500387 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/682aafdb-c596-4a7b-8112-c6c867ff770e-service-ca-bundle") pod "router-default-5887855b54-djpw5" (UID: "682aafdb-c596-4a7b-8112-c6c867ff770e") : configmap references non-existent config key: service-ca.crt Apr 16 16:48:41.200102 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:41.200070 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/363c07d0-bf5c-4368-a3fe-6d5136c2cd22-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-9v9hx\" (UID: \"363c07d0-bf5c-4368-a3fe-6d5136c2cd22\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-9v9hx" Apr 16 16:48:41.200271 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:41.200228 2573 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 16:48:41.200271 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:41.200249 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/7bc7dff0-16af-4031-a829-4427a2699284-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-xn576\" (UID: \"7bc7dff0-16af-4031-a829-4427a2699284\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-xn576" Apr 16 16:48:41.200408 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:41.200299 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/363c07d0-bf5c-4368-a3fe-6d5136c2cd22-networking-console-plugin-cert podName:363c07d0-bf5c-4368-a3fe-6d5136c2cd22 nodeName:}" failed. No retries permitted until 2026-04-16 16:48:49.200278142 +0000 UTC m=+54.340996868 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/363c07d0-bf5c-4368-a3fe-6d5136c2cd22-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-9v9hx" (UID: "363c07d0-bf5c-4368-a3fe-6d5136c2cd22") : secret "networking-console-plugin-cert" not found Apr 16 16:48:41.200408 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:41.200342 2573 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 16:48:41.200408 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:41.200407 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7bc7dff0-16af-4031-a829-4427a2699284-cluster-monitoring-operator-tls podName:7bc7dff0-16af-4031-a829-4427a2699284 nodeName:}" failed. No retries permitted until 2026-04-16 16:48:49.20037467 +0000 UTC m=+54.341093394 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/7bc7dff0-16af-4031-a829-4427a2699284-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-xn576" (UID: "7bc7dff0-16af-4031-a829-4427a2699284") : secret "cluster-monitoring-operator-tls" not found Apr 16 16:48:44.127104 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:44.127067 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d030f4e9-69ab-40db-8fd6-d2f53d467cdb-registry-tls\") pod \"image-registry-558df76499-sw5fx\" (UID: \"d030f4e9-69ab-40db-8fd6-d2f53d467cdb\") " pod="openshift-image-registry/image-registry-558df76499-sw5fx" Apr 16 16:48:44.127589 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:44.127230 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 16:48:44.127589 ip-10-0-138-58 kubenswrapper[2573]: E0416 
16:48:44.127249 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-558df76499-sw5fx: secret "image-registry-tls" not found Apr 16 16:48:44.127589 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:44.127308 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d030f4e9-69ab-40db-8fd6-d2f53d467cdb-registry-tls podName:d030f4e9-69ab-40db-8fd6-d2f53d467cdb nodeName:}" failed. No retries permitted until 2026-04-16 16:49:00.127291107 +0000 UTC m=+65.268009844 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/d030f4e9-69ab-40db-8fd6-d2f53d467cdb-registry-tls") pod "image-registry-558df76499-sw5fx" (UID: "d030f4e9-69ab-40db-8fd6-d2f53d467cdb") : secret "image-registry-tls" not found Apr 16 16:48:44.228334 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:44.228311 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/40e02308-a3e4-43c3-8e6d-b59cfe039143-cert\") pod \"ingress-canary-zt96l\" (UID: \"40e02308-a3e4-43c3-8e6d-b59cfe039143\") " pod="openshift-ingress-canary/ingress-canary-zt96l" Apr 16 16:48:44.228449 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:44.228352 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/db86a360-38b7-4c87-ac77-176127220106-metrics-tls\") pod \"dns-default-lhgcr\" (UID: \"db86a360-38b7-4c87-ac77-176127220106\") " pod="openshift-dns/dns-default-lhgcr" Apr 16 16:48:44.228505 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:44.228484 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 16:48:44.228551 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:44.228516 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: 
secret "canary-serving-cert" not found Apr 16 16:48:44.228589 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:44.228552 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/db86a360-38b7-4c87-ac77-176127220106-metrics-tls podName:db86a360-38b7-4c87-ac77-176127220106 nodeName:}" failed. No retries permitted until 2026-04-16 16:49:00.228533289 +0000 UTC m=+65.369252026 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/db86a360-38b7-4c87-ac77-176127220106-metrics-tls") pod "dns-default-lhgcr" (UID: "db86a360-38b7-4c87-ac77-176127220106") : secret "dns-default-metrics-tls" not found Apr 16 16:48:44.228589 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:44.228569 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/40e02308-a3e4-43c3-8e6d-b59cfe039143-cert podName:40e02308-a3e4-43c3-8e6d-b59cfe039143 nodeName:}" failed. No retries permitted until 2026-04-16 16:49:00.228561339 +0000 UTC m=+65.369280060 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/40e02308-a3e4-43c3-8e6d-b59cfe039143-cert") pod "ingress-canary-zt96l" (UID: "40e02308-a3e4-43c3-8e6d-b59cfe039143") : secret "canary-serving-cert" not found Apr 16 16:48:45.439769 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:45.439694 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-2sdvm" podStartSLOduration=12.439679792 podStartE2EDuration="12.439679792s" podCreationTimestamp="2026-04-16 16:48:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:48:38.75811309 +0000 UTC m=+43.898831836" watchObservedRunningTime="2026-04-16 16:48:45.439679792 +0000 UTC m=+50.580398576" Apr 16 16:48:46.783192 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:46.783156 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8596fb7f8-nnmch" event={"ID":"536018da-d22a-4d5a-a54d-4d2116c68151","Type":"ContainerStarted","Data":"1d2dfc69e221ffc42e33433e95184da0c0e21d6fcf12c3a5b32dc425e8f7192c"} Apr 16 16:48:46.783757 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:46.783728 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8596fb7f8-nnmch" event={"ID":"536018da-d22a-4d5a-a54d-4d2116c68151","Type":"ContainerStarted","Data":"75b16bf4dd6641ca10d25ba23a23f511667f5918ed6621811c1e0f7899853230"} Apr 16 16:48:46.784664 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:46.784637 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-7wg74" event={"ID":"2b75e55f-5bdd-4cbb-abd0-69be2a62852e","Type":"ContainerStarted","Data":"43d9b63ef5f4f55f1909d8cb9e0976e537cf9d7feaf9ac6c48d640644787c783"} Apr 16 16:48:46.786119 ip-10-0-138-58 
kubenswrapper[2573]: I0416 16:48:46.786099 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-jgsfb_6c99cce0-b27a-481f-8825-9d205581b7d0/console-operator/0.log" Apr 16 16:48:46.786201 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:46.786135 2573 generic.go:358] "Generic (PLEG): container finished" podID="6c99cce0-b27a-481f-8825-9d205581b7d0" containerID="b72c0925ad5d4c651afde1e8e9f4d8071e8debf66c0b0442b6c0dcb157db2983" exitCode=255 Apr 16 16:48:46.786259 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:46.786195 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-jgsfb" event={"ID":"6c99cce0-b27a-481f-8825-9d205581b7d0","Type":"ContainerDied","Data":"b72c0925ad5d4c651afde1e8e9f4d8071e8debf66c0b0442b6c0dcb157db2983"} Apr 16 16:48:46.786439 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:46.786421 2573 scope.go:117] "RemoveContainer" containerID="b72c0925ad5d4c651afde1e8e9f4d8071e8debf66c0b0442b6c0dcb157db2983" Apr 16 16:48:46.788032 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:46.788002 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-4zxj2" event={"ID":"59cc8b51-5a0b-45ea-8e53-f1473e78b939","Type":"ContainerStarted","Data":"f527c2ffd4bcc98dd23f17f7ae83995e269e67bc403d4975830f4280ad67929b"} Apr 16 16:48:46.789438 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:46.789402 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-x9csm" event={"ID":"ebea9173-b1aa-4ef6-a1bb-6b7483b70fd2","Type":"ContainerStarted","Data":"abadd7c229b0d765f3a8054c53964b693d8a58011846c784178f09d8877efb03"} Apr 16 16:48:46.790764 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:46.790736 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-2gpsx" 
event={"ID":"96edca93-47fe-432e-86c9-b734c62b1424","Type":"ContainerStarted","Data":"04ac6da4f4b9b5cd5c4e3284ae19a05c77bde1db479c5461c1a221222ff7f78a"} Apr 16 16:48:46.792048 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:46.792018 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-7w64z" event={"ID":"4b32d91f-2c9d-4d71-b910-066e212015e3","Type":"ContainerStarted","Data":"39ee3dd74a37f13140090dcc159208e056ff3eace72f9a0a82e5636090c50654"} Apr 16 16:48:46.801237 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:46.801189 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8596fb7f8-nnmch" podStartSLOduration=4.417646035 podStartE2EDuration="20.801176443s" podCreationTimestamp="2026-04-16 16:48:26 +0000 UTC" firstStartedPulling="2026-04-16 16:48:29.515914734 +0000 UTC m=+34.656633454" lastFinishedPulling="2026-04-16 16:48:45.899445127 +0000 UTC m=+51.040163862" observedRunningTime="2026-04-16 16:48:46.800840733 +0000 UTC m=+51.941559472" watchObservedRunningTime="2026-04-16 16:48:46.801176443 +0000 UTC m=+51.941895188" Apr 16 16:48:46.821729 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:46.821680 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-2gpsx" podStartSLOduration=5.900680958 podStartE2EDuration="13.821665591s" podCreationTimestamp="2026-04-16 16:48:33 +0000 UTC" firstStartedPulling="2026-04-16 16:48:37.934075514 +0000 UTC m=+43.074794233" lastFinishedPulling="2026-04-16 16:48:45.855060132 +0000 UTC m=+50.995778866" observedRunningTime="2026-04-16 16:48:46.820862126 +0000 UTC m=+51.961580873" watchObservedRunningTime="2026-04-16 16:48:46.821665591 +0000 UTC m=+51.962384531" Apr 16 16:48:46.841725 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:46.841688 2573 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-4zxj2" podStartSLOduration=35.657183254 podStartE2EDuration="43.841679333s" podCreationTimestamp="2026-04-16 16:48:03 +0000 UTC" firstStartedPulling="2026-04-16 16:48:37.931580074 +0000 UTC m=+43.072298795" lastFinishedPulling="2026-04-16 16:48:46.116076153 +0000 UTC m=+51.256794874" observedRunningTime="2026-04-16 16:48:46.840774347 +0000 UTC m=+51.981493090" watchObservedRunningTime="2026-04-16 16:48:46.841679333 +0000 UTC m=+51.982398073" Apr 16 16:48:46.856890 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:46.856857 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-5785d4fcdd-x9csm" podStartSLOduration=5.893474051 podStartE2EDuration="13.856847601s" podCreationTimestamp="2026-04-16 16:48:33 +0000 UTC" firstStartedPulling="2026-04-16 16:48:37.936399719 +0000 UTC m=+43.077118439" lastFinishedPulling="2026-04-16 16:48:45.899773254 +0000 UTC m=+51.040491989" observedRunningTime="2026-04-16 16:48:46.856370717 +0000 UTC m=+51.997089459" watchObservedRunningTime="2026-04-16 16:48:46.856847601 +0000 UTC m=+51.997566339" Apr 16 16:48:46.890236 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:46.890161 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-7wg74" podStartSLOduration=5.916013368 podStartE2EDuration="13.890146446s" podCreationTimestamp="2026-04-16 16:48:33 +0000 UTC" firstStartedPulling="2026-04-16 16:48:37.931579377 +0000 UTC m=+43.072298096" lastFinishedPulling="2026-04-16 16:48:45.905712441 +0000 UTC m=+51.046431174" observedRunningTime="2026-04-16 16:48:46.887892901 +0000 UTC m=+52.028611645" watchObservedRunningTime="2026-04-16 16:48:46.890146446 +0000 UTC m=+52.030865196" Apr 16 16:48:47.796492 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:47.796465 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-jgsfb_6c99cce0-b27a-481f-8825-9d205581b7d0/console-operator/1.log"
Apr 16 16:48:47.796861 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:47.796799 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-jgsfb_6c99cce0-b27a-481f-8825-9d205581b7d0/console-operator/0.log"
Apr 16 16:48:47.796861 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:47.796829 2573 generic.go:358] "Generic (PLEG): container finished" podID="6c99cce0-b27a-481f-8825-9d205581b7d0" containerID="1e9d95c085382cac50bf73824a5b14245885670c6791e4417df24029f6ee1a6a" exitCode=255
Apr 16 16:48:47.797006 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:47.796959 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-jgsfb" event={"ID":"6c99cce0-b27a-481f-8825-9d205581b7d0","Type":"ContainerDied","Data":"1e9d95c085382cac50bf73824a5b14245885670c6791e4417df24029f6ee1a6a"}
Apr 16 16:48:47.797096 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:47.797012 2573 scope.go:117] "RemoveContainer" containerID="b72c0925ad5d4c651afde1e8e9f4d8071e8debf66c0b0442b6c0dcb157db2983"
Apr 16 16:48:47.797153 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:47.797105 2573 scope.go:117] "RemoveContainer" containerID="1e9d95c085382cac50bf73824a5b14245885670c6791e4417df24029f6ee1a6a"
Apr 16 16:48:47.797320 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:47.797302 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-d87b8d5fc-jgsfb_openshift-console-operator(6c99cce0-b27a-481f-8825-9d205581b7d0)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-jgsfb" podUID="6c99cce0-b27a-481f-8825-9d205581b7d0"
Apr 16 16:48:47.818890 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:47.818852 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-7w64z" podStartSLOduration=6.60951232 podStartE2EDuration="14.818841961s" podCreationTimestamp="2026-04-16 16:48:33 +0000 UTC" firstStartedPulling="2026-04-16 16:48:37.69067766 +0000 UTC m=+42.831396385" lastFinishedPulling="2026-04-16 16:48:45.900007305 +0000 UTC m=+51.040726026" observedRunningTime="2026-04-16 16:48:46.924341279 +0000 UTC m=+52.065060022" watchObservedRunningTime="2026-04-16 16:48:47.818841961 +0000 UTC m=+52.959560703"
Apr 16 16:48:48.802803 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:48.802768 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-jgsfb_6c99cce0-b27a-481f-8825-9d205581b7d0/console-operator/1.log"
Apr 16 16:48:48.803304 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:48.803229 2573 scope.go:117] "RemoveContainer" containerID="1e9d95c085382cac50bf73824a5b14245885670c6791e4417df24029f6ee1a6a"
Apr 16 16:48:48.803496 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:48.803472 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-d87b8d5fc-jgsfb_openshift-console-operator(6c99cce0-b27a-481f-8825-9d205581b7d0)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-jgsfb" podUID="6c99cce0-b27a-481f-8825-9d205581b7d0"
Apr 16 16:48:49.172373 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:49.172310 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/682aafdb-c596-4a7b-8112-c6c867ff770e-service-ca-bundle\") pod \"router-default-5887855b54-djpw5\" (UID: \"682aafdb-c596-4a7b-8112-c6c867ff770e\") " pod="openshift-ingress/router-default-5887855b54-djpw5"
Apr 16 16:48:49.172510 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:49.172435 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/8af394bc-025e-4545-801f-0e6309febaa3-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-xr578\" (UID: \"8af394bc-025e-4545-801f-0e6309febaa3\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-xr578"
Apr 16 16:48:49.172510 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:49.172462 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/682aafdb-c596-4a7b-8112-c6c867ff770e-service-ca-bundle podName:682aafdb-c596-4a7b-8112-c6c867ff770e nodeName:}" failed. No retries permitted until 2026-04-16 16:49:05.172447792 +0000 UTC m=+70.313166516 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/682aafdb-c596-4a7b-8112-c6c867ff770e-service-ca-bundle") pod "router-default-5887855b54-djpw5" (UID: "682aafdb-c596-4a7b-8112-c6c867ff770e") : configmap references non-existent config key: service-ca.crt
Apr 16 16:48:49.172510 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:49.172486 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/682aafdb-c596-4a7b-8112-c6c867ff770e-metrics-certs\") pod \"router-default-5887855b54-djpw5\" (UID: \"682aafdb-c596-4a7b-8112-c6c867ff770e\") " pod="openshift-ingress/router-default-5887855b54-djpw5"
Apr 16 16:48:49.172510 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:49.172498 2573 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 16 16:48:49.172728 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:49.172533 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8af394bc-025e-4545-801f-0e6309febaa3-samples-operator-tls podName:8af394bc-025e-4545-801f-0e6309febaa3 nodeName:}" failed. No retries permitted until 2026-04-16 16:49:05.172522552 +0000 UTC m=+70.313241272 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/8af394bc-025e-4545-801f-0e6309febaa3-samples-operator-tls") pod "cluster-samples-operator-667775844f-xr578" (UID: "8af394bc-025e-4545-801f-0e6309febaa3") : secret "samples-operator-tls" not found
Apr 16 16:48:49.172728 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:49.172570 2573 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 16 16:48:49.172728 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:49.172601 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/682aafdb-c596-4a7b-8112-c6c867ff770e-metrics-certs podName:682aafdb-c596-4a7b-8112-c6c867ff770e nodeName:}" failed. No retries permitted until 2026-04-16 16:49:05.172589689 +0000 UTC m=+70.313308438 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/682aafdb-c596-4a7b-8112-c6c867ff770e-metrics-certs") pod "router-default-5887855b54-djpw5" (UID: "682aafdb-c596-4a7b-8112-c6c867ff770e") : secret "router-metrics-certs-default" not found
Apr 16 16:48:49.272840 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:49.272814 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/363c07d0-bf5c-4368-a3fe-6d5136c2cd22-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-9v9hx\" (UID: \"363c07d0-bf5c-4368-a3fe-6d5136c2cd22\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-9v9hx"
Apr 16 16:48:49.272928 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:49.272868 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/7bc7dff0-16af-4031-a829-4427a2699284-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-xn576\" (UID: \"7bc7dff0-16af-4031-a829-4427a2699284\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-xn576"
Apr 16 16:48:49.272986 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:49.272936 2573 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 16 16:48:49.272986 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:49.272942 2573 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 16 16:48:49.272986 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:49.272971 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7bc7dff0-16af-4031-a829-4427a2699284-cluster-monitoring-operator-tls podName:7bc7dff0-16af-4031-a829-4427a2699284 nodeName:}" failed. No retries permitted until 2026-04-16 16:49:05.272961269 +0000 UTC m=+70.413679990 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/7bc7dff0-16af-4031-a829-4427a2699284-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-xn576" (UID: "7bc7dff0-16af-4031-a829-4427a2699284") : secret "cluster-monitoring-operator-tls" not found
Apr 16 16:48:49.273106 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:49.272991 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/363c07d0-bf5c-4368-a3fe-6d5136c2cd22-networking-console-plugin-cert podName:363c07d0-bf5c-4368-a3fe-6d5136c2cd22 nodeName:}" failed. No retries permitted until 2026-04-16 16:49:05.272979781 +0000 UTC m=+70.413698501 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/363c07d0-bf5c-4368-a3fe-6d5136c2cd22-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-9v9hx" (UID: "363c07d0-bf5c-4368-a3fe-6d5136c2cd22") : secret "networking-console-plugin-cert" not found
Apr 16 16:48:49.894054 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:49.894028 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-c8bgf_38487532-b4be-41ef-a345-cea3cc5a643c/dns-node-resolver/0.log"
Apr 16 16:48:50.694558 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:50.694531 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-49l6w_a3cab79a-4ea1-4744-b205-fd85c929391f/node-ca/0.log"
Apr 16 16:48:52.296494 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:52.296458 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-756bb7d76f-7w64z_4b32d91f-2c9d-4d71-b910-066e212015e3/kube-storage-version-migrator-operator/0.log"
Apr 16 16:48:53.601413 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:53.601367 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-d87b8d5fc-jgsfb"
Apr 16 16:48:53.601780 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:53.601427 2573 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-d87b8d5fc-jgsfb"
Apr 16 16:48:53.601780 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:53.601745 2573 scope.go:117] "RemoveContainer" containerID="1e9d95c085382cac50bf73824a5b14245885670c6791e4417df24029f6ee1a6a"
Apr 16 16:48:53.601914 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:48:53.601896 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-d87b8d5fc-jgsfb_openshift-console-operator(6c99cce0-b27a-481f-8825-9d205581b7d0)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-jgsfb" podUID="6c99cce0-b27a-481f-8825-9d205581b7d0"
Apr 16 16:48:54.621423 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:48:54.621372 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vts2x"
Apr 16 16:49:00.161340 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:00.161301 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d030f4e9-69ab-40db-8fd6-d2f53d467cdb-registry-tls\") pod \"image-registry-558df76499-sw5fx\" (UID: \"d030f4e9-69ab-40db-8fd6-d2f53d467cdb\") " pod="openshift-image-registry/image-registry-558df76499-sw5fx"
Apr 16 16:49:00.163721 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:00.163691 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d030f4e9-69ab-40db-8fd6-d2f53d467cdb-registry-tls\") pod \"image-registry-558df76499-sw5fx\" (UID: \"d030f4e9-69ab-40db-8fd6-d2f53d467cdb\") " pod="openshift-image-registry/image-registry-558df76499-sw5fx"
Apr 16 16:49:00.262432 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:00.262405 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/40e02308-a3e4-43c3-8e6d-b59cfe039143-cert\") pod \"ingress-canary-zt96l\" (UID: \"40e02308-a3e4-43c3-8e6d-b59cfe039143\") " pod="openshift-ingress-canary/ingress-canary-zt96l"
Apr 16 16:49:00.262546 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:00.262444 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/db86a360-38b7-4c87-ac77-176127220106-metrics-tls\") pod \"dns-default-lhgcr\" (UID: \"db86a360-38b7-4c87-ac77-176127220106\") " pod="openshift-dns/dns-default-lhgcr"
Apr 16 16:49:00.264698 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:00.264673 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/db86a360-38b7-4c87-ac77-176127220106-metrics-tls\") pod \"dns-default-lhgcr\" (UID: \"db86a360-38b7-4c87-ac77-176127220106\") " pod="openshift-dns/dns-default-lhgcr"
Apr 16 16:49:00.275520 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:00.275490 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/40e02308-a3e4-43c3-8e6d-b59cfe039143-cert\") pod \"ingress-canary-zt96l\" (UID: \"40e02308-a3e4-43c3-8e6d-b59cfe039143\") " pod="openshift-ingress-canary/ingress-canary-zt96l"
Apr 16 16:49:00.346177 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:00.346155 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-lm58j\""
Apr 16 16:49:00.353757 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:00.353742 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-558df76499-sw5fx"
Apr 16 16:49:00.426977 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:00.426719 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-fwjqf\""
Apr 16 16:49:00.437592 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:00.431097 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-gwwfm\""
Apr 16 16:49:00.437745 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:00.437641 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-lhgcr"
Apr 16 16:49:00.439347 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:00.439329 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-zt96l"
Apr 16 16:49:00.506625 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:00.505733 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-558df76499-sw5fx"]
Apr 16 16:49:00.512397 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:49:00.512350 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd030f4e9_69ab_40db_8fd6_d2f53d467cdb.slice/crio-f8b9670725b26fab7121bfc542751791e95756bb2fa6bfb01bfc4f60f9492679 WatchSource:0}: Error finding container f8b9670725b26fab7121bfc542751791e95756bb2fa6bfb01bfc4f60f9492679: Status 404 returned error can't find the container with id f8b9670725b26fab7121bfc542751791e95756bb2fa6bfb01bfc4f60f9492679
Apr 16 16:49:00.567762 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:00.567734 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-lhgcr"]
Apr 16 16:49:00.575429 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:49:00.575401 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb86a360_38b7_4c87_ac77_176127220106.slice/crio-8c85f01ebb989551c17003ada772c5e510b0b9ac85a4be90f9a82b8ef1af7947 WatchSource:0}: Error finding container 8c85f01ebb989551c17003ada772c5e510b0b9ac85a4be90f9a82b8ef1af7947: Status 404 returned error can't find the container with id 8c85f01ebb989551c17003ada772c5e510b0b9ac85a4be90f9a82b8ef1af7947
Apr 16 16:49:00.583397 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:00.583360 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-zt96l"]
Apr 16 16:49:00.585611 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:49:00.585590 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod40e02308_a3e4_43c3_8e6d_b59cfe039143.slice/crio-abad83d0eefdf33013407d21e205e920676d36809b1298d3f8f4f19c25f5d16c WatchSource:0}: Error finding container abad83d0eefdf33013407d21e205e920676d36809b1298d3f8f4f19c25f5d16c: Status 404 returned error can't find the container with id abad83d0eefdf33013407d21e205e920676d36809b1298d3f8f4f19c25f5d16c
Apr 16 16:49:00.834487 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:00.834452 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-558df76499-sw5fx" event={"ID":"d030f4e9-69ab-40db-8fd6-d2f53d467cdb","Type":"ContainerStarted","Data":"ba68d95393aa1a593c2f0b438adfb76cd0e0c26521d7c50db90fdb0e90af6269"}
Apr 16 16:49:00.834645 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:00.834494 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-558df76499-sw5fx" event={"ID":"d030f4e9-69ab-40db-8fd6-d2f53d467cdb","Type":"ContainerStarted","Data":"f8b9670725b26fab7121bfc542751791e95756bb2fa6bfb01bfc4f60f9492679"}
Apr 16 16:49:00.834645 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:00.834581 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-558df76499-sw5fx"
Apr 16 16:49:00.835684 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:00.835663 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-zt96l" event={"ID":"40e02308-a3e4-43c3-8e6d-b59cfe039143","Type":"ContainerStarted","Data":"abad83d0eefdf33013407d21e205e920676d36809b1298d3f8f4f19c25f5d16c"}
Apr 16 16:49:00.836672 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:00.836638 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-lhgcr" event={"ID":"db86a360-38b7-4c87-ac77-176127220106","Type":"ContainerStarted","Data":"8c85f01ebb989551c17003ada772c5e510b0b9ac85a4be90f9a82b8ef1af7947"}
Apr 16 16:49:00.855413 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:00.855364 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-558df76499-sw5fx" podStartSLOduration=65.855353023 podStartE2EDuration="1m5.855353023s" podCreationTimestamp="2026-04-16 16:47:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:49:00.854341531 +0000 UTC m=+65.995060274" watchObservedRunningTime="2026-04-16 16:49:00.855353023 +0000 UTC m=+65.996071764"
Apr 16 16:49:01.171114 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:01.171031 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f7e26d85-638f-42c1-9b32-67320a5cbbe3-metrics-certs\") pod \"network-metrics-daemon-x6gbd\" (UID: \"f7e26d85-638f-42c1-9b32-67320a5cbbe3\") " pod="openshift-multus/network-metrics-daemon-x6gbd"
Apr 16 16:49:01.173821 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:01.173788 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f7e26d85-638f-42c1-9b32-67320a5cbbe3-metrics-certs\") pod \"network-metrics-daemon-x6gbd\" (UID: \"f7e26d85-638f-42c1-9b32-67320a5cbbe3\") " pod="openshift-multus/network-metrics-daemon-x6gbd"
Apr 16 16:49:01.431456 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:01.431373 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-v7sx7\""
Apr 16 16:49:01.440036 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:01.439736 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x6gbd"
Apr 16 16:49:01.579623 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:01.579580 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-x6gbd"]
Apr 16 16:49:01.583918 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:49:01.583886 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf7e26d85_638f_42c1_9b32_67320a5cbbe3.slice/crio-0e6c87a3071d0fe27e09d5eeb4e602417dbfa9f9de2f294345ab8b4b760e4735 WatchSource:0}: Error finding container 0e6c87a3071d0fe27e09d5eeb4e602417dbfa9f9de2f294345ab8b4b760e4735: Status 404 returned error can't find the container with id 0e6c87a3071d0fe27e09d5eeb4e602417dbfa9f9de2f294345ab8b4b760e4735
Apr 16 16:49:01.842732 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:01.842690 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-x6gbd" event={"ID":"f7e26d85-638f-42c1-9b32-67320a5cbbe3","Type":"ContainerStarted","Data":"0e6c87a3071d0fe27e09d5eeb4e602417dbfa9f9de2f294345ab8b4b760e4735"}
Apr 16 16:49:03.852427 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:03.852393 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-lhgcr" event={"ID":"db86a360-38b7-4c87-ac77-176127220106","Type":"ContainerStarted","Data":"624b2503a4751b45868e48de3d1e03142d563b7b8831e6079fcce8e44452096a"}
Apr 16 16:49:03.852820 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:03.852440 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-lhgcr" event={"ID":"db86a360-38b7-4c87-ac77-176127220106","Type":"ContainerStarted","Data":"5243d0b04ca4a86fd1cac1b243db7ab098c61508175d5e1961782b73b55bfcf5"}
Apr 16 16:49:03.852820 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:03.852537 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-lhgcr"
Apr 16 16:49:03.854002 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:03.853978 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-zt96l" event={"ID":"40e02308-a3e4-43c3-8e6d-b59cfe039143","Type":"ContainerStarted","Data":"72683bb0eec2bf9badce41a3d4570f467cd6b1adb4413cafdb647f2c679b34cb"}
Apr 16 16:49:03.870641 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:03.870588 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-lhgcr" podStartSLOduration=33.556516621 podStartE2EDuration="35.870573709s" podCreationTimestamp="2026-04-16 16:48:28 +0000 UTC" firstStartedPulling="2026-04-16 16:49:00.577175519 +0000 UTC m=+65.717894238" lastFinishedPulling="2026-04-16 16:49:02.891232605 +0000 UTC m=+68.031951326" observedRunningTime="2026-04-16 16:49:03.869747156 +0000 UTC m=+69.010465899" watchObservedRunningTime="2026-04-16 16:49:03.870573709 +0000 UTC m=+69.011292453"
Apr 16 16:49:03.886358 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:03.886312 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-zt96l" podStartSLOduration=33.575263408 podStartE2EDuration="35.886296746s" podCreationTimestamp="2026-04-16 16:48:28 +0000 UTC" firstStartedPulling="2026-04-16 16:49:00.587490689 +0000 UTC m=+65.728209413" lastFinishedPulling="2026-04-16 16:49:02.898524017 +0000 UTC m=+68.039242751" observedRunningTime="2026-04-16 16:49:03.885010734 +0000 UTC m=+69.025729476" watchObservedRunningTime="2026-04-16 16:49:03.886296746 +0000 UTC m=+69.027015489"
Apr 16 16:49:04.417401 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:04.417320 2573 scope.go:117] "RemoveContainer" containerID="1e9d95c085382cac50bf73824a5b14245885670c6791e4417df24029f6ee1a6a"
Apr 16 16:49:04.858445 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:04.858423 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-jgsfb_6c99cce0-b27a-481f-8825-9d205581b7d0/console-operator/1.log"
Apr 16 16:49:04.858886 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:04.858518 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-jgsfb" event={"ID":"6c99cce0-b27a-481f-8825-9d205581b7d0","Type":"ContainerStarted","Data":"7c6beeb9261af884bed2e8fac867b446000729d2b44eef0f0b2f203190ed7361"}
Apr 16 16:49:04.858886 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:04.858839 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-d87b8d5fc-jgsfb"
Apr 16 16:49:04.860341 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:04.860319 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-x6gbd" event={"ID":"f7e26d85-638f-42c1-9b32-67320a5cbbe3","Type":"ContainerStarted","Data":"72a6384230b983a89f239d2eb88ada385f340fb31000ae960351c60290023724"}
Apr 16 16:49:04.860471 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:04.860349 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-x6gbd" event={"ID":"f7e26d85-638f-42c1-9b32-67320a5cbbe3","Type":"ContainerStarted","Data":"d06ac888c0da0cdc3d43a04f0a7c4712c0ca04242d19944bf4d6e63626a51100"}
Apr 16 16:49:04.882620 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:04.882578 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-d87b8d5fc-jgsfb" podStartSLOduration=23.857190072 podStartE2EDuration="31.882565918s" podCreationTimestamp="2026-04-16 16:48:33 +0000 UTC" firstStartedPulling="2026-04-16 16:48:37.874409101 +0000 UTC m=+43.015127821" lastFinishedPulling="2026-04-16 16:48:45.899784943 +0000 UTC m=+51.040503667" observedRunningTime="2026-04-16 16:49:04.880479995 +0000 UTC m=+70.021198737" watchObservedRunningTime="2026-04-16 16:49:04.882565918 +0000 UTC m=+70.023284660"
Apr 16 16:49:04.894476 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:04.894439 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-x6gbd" podStartSLOduration=67.683889888 podStartE2EDuration="1m9.894428812s" podCreationTimestamp="2026-04-16 16:47:55 +0000 UTC" firstStartedPulling="2026-04-16 16:49:01.586320261 +0000 UTC m=+66.727038982" lastFinishedPulling="2026-04-16 16:49:03.796859186 +0000 UTC m=+68.937577906" observedRunningTime="2026-04-16 16:49:04.894390176 +0000 UTC m=+70.035108908" watchObservedRunningTime="2026-04-16 16:49:04.894428812 +0000 UTC m=+70.035147554"
Apr 16 16:49:05.056573 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:05.056542 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-d87b8d5fc-jgsfb"
Apr 16 16:49:05.211001 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:05.210915 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/8af394bc-025e-4545-801f-0e6309febaa3-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-xr578\" (UID: \"8af394bc-025e-4545-801f-0e6309febaa3\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-xr578"
Apr 16 16:49:05.211001 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:05.210955 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/682aafdb-c596-4a7b-8112-c6c867ff770e-metrics-certs\") pod \"router-default-5887855b54-djpw5\" (UID: \"682aafdb-c596-4a7b-8112-c6c867ff770e\") " pod="openshift-ingress/router-default-5887855b54-djpw5"
Apr 16 16:49:05.211001 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:05.210985 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/682aafdb-c596-4a7b-8112-c6c867ff770e-service-ca-bundle\") pod \"router-default-5887855b54-djpw5\" (UID: \"682aafdb-c596-4a7b-8112-c6c867ff770e\") " pod="openshift-ingress/router-default-5887855b54-djpw5"
Apr 16 16:49:05.211691 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:05.211666 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/682aafdb-c596-4a7b-8112-c6c867ff770e-service-ca-bundle\") pod \"router-default-5887855b54-djpw5\" (UID: \"682aafdb-c596-4a7b-8112-c6c867ff770e\") " pod="openshift-ingress/router-default-5887855b54-djpw5"
Apr 16 16:49:05.213527 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:05.213495 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/682aafdb-c596-4a7b-8112-c6c867ff770e-metrics-certs\") pod \"router-default-5887855b54-djpw5\" (UID: \"682aafdb-c596-4a7b-8112-c6c867ff770e\") " pod="openshift-ingress/router-default-5887855b54-djpw5"
Apr 16 16:49:05.213832 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:05.213812 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/8af394bc-025e-4545-801f-0e6309febaa3-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-xr578\" (UID: \"8af394bc-025e-4545-801f-0e6309febaa3\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-xr578"
Apr 16 16:49:05.312263 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:05.312232 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/7bc7dff0-16af-4031-a829-4427a2699284-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-xn576\" (UID: \"7bc7dff0-16af-4031-a829-4427a2699284\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-xn576"
Apr 16 16:49:05.312407 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:05.312309 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/363c07d0-bf5c-4368-a3fe-6d5136c2cd22-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-9v9hx\" (UID: \"363c07d0-bf5c-4368-a3fe-6d5136c2cd22\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-9v9hx"
Apr 16 16:49:05.314490 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:05.314462 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/363c07d0-bf5c-4368-a3fe-6d5136c2cd22-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-9v9hx\" (UID: \"363c07d0-bf5c-4368-a3fe-6d5136c2cd22\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-9v9hx"
Apr 16 16:49:05.314586 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:05.314503 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/7bc7dff0-16af-4031-a829-4427a2699284-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-xn576\" (UID: \"7bc7dff0-16af-4031-a829-4427a2699284\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-xn576"
Apr 16 16:49:05.368260 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:05.368237 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-wk772\""
Apr 16 16:49:05.376497 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:05.376483 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-xr578"
Apr 16 16:49:05.389184 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:05.389162 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-q249j\""
Apr 16 16:49:05.396585 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:05.396565 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5887855b54-djpw5"
Apr 16 16:49:05.478471 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:05.478441 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-hxttb\""
Apr 16 16:49:05.486812 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:05.486788 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-9v9hx"
Apr 16 16:49:05.494114 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:05.494094 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-rjlj5\""
Apr 16 16:49:05.502060 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:05.502026 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-667775844f-xr578"]
Apr 16 16:49:05.502060 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:05.502025 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-xn576"
Apr 16 16:49:05.550592 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:05.550263 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-5887855b54-djpw5"]
Apr 16 16:49:05.557315 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:49:05.557279 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod682aafdb_c596_4a7b_8112_c6c867ff770e.slice/crio-c205f99325a23ee52ce875d97e5cbe190c94dcd640cf6b2e439a0fdebfc08a9c WatchSource:0}: Error finding container c205f99325a23ee52ce875d97e5cbe190c94dcd640cf6b2e439a0fdebfc08a9c: Status 404 returned error can't find the container with id c205f99325a23ee52ce875d97e5cbe190c94dcd640cf6b2e439a0fdebfc08a9c
Apr 16 16:49:05.628162 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:05.628140 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-5cb6cf4cb4-9v9hx"]
Apr 16 16:49:05.630279 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:49:05.630248 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod363c07d0_bf5c_4368_a3fe_6d5136c2cd22.slice/crio-96a12facd8cd495db396cbaa0a022e88bd1d65be1740676d9e967b3ef54c3db4 WatchSource:0}: Error finding container 96a12facd8cd495db396cbaa0a022e88bd1d65be1740676d9e967b3ef54c3db4: Status 404 returned error can't find the container with id 96a12facd8cd495db396cbaa0a022e88bd1d65be1740676d9e967b3ef54c3db4
Apr 16 16:49:05.653192 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:05.653173 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6667474d89-xn576"]
Apr 16 16:49:05.655826 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:49:05.655802 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7bc7dff0_16af_4031_a829_4427a2699284.slice/crio-03a2746bcf224f3bc9871667c20e7d3879e0840b6f83589a721bc8d53aa37ebc WatchSource:0}: Error finding container 03a2746bcf224f3bc9871667c20e7d3879e0840b6f83589a721bc8d53aa37ebc: Status 404 returned error can't find the container with id 03a2746bcf224f3bc9871667c20e7d3879e0840b6f83589a721bc8d53aa37ebc
Apr 16 16:49:05.865172 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:05.865136 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5887855b54-djpw5" event={"ID":"682aafdb-c596-4a7b-8112-c6c867ff770e","Type":"ContainerStarted","Data":"86da921780071c808be10e602fb69f81566d4a5f00ae504416963c7cda148d8a"}
Apr 16 16:49:05.865172 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:05.865174 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5887855b54-djpw5" event={"ID":"682aafdb-c596-4a7b-8112-c6c867ff770e","Type":"ContainerStarted","Data":"c205f99325a23ee52ce875d97e5cbe190c94dcd640cf6b2e439a0fdebfc08a9c"}
Apr 16 16:49:05.866268 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:05.866243 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-xr578" event={"ID":"8af394bc-025e-4545-801f-0e6309febaa3","Type":"ContainerStarted","Data":"8fc41fdf87d1b02590e8e7592cba6fdc7813ede79cc61d2f5c78afaa41b6f8c8"}
Apr 16 16:49:05.867298 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:05.867278 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-9v9hx" event={"ID":"363c07d0-bf5c-4368-a3fe-6d5136c2cd22","Type":"ContainerStarted","Data":"96a12facd8cd495db396cbaa0a022e88bd1d65be1740676d9e967b3ef54c3db4"}
Apr 16 16:49:05.868216 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:05.868192 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-xn576" event={"ID":"7bc7dff0-16af-4031-a829-4427a2699284","Type":"ContainerStarted","Data":"03a2746bcf224f3bc9871667c20e7d3879e0840b6f83589a721bc8d53aa37ebc"}
Apr 16 16:49:06.396797 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:06.396761 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5887855b54-djpw5"
Apr 16 16:49:06.399663 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:06.399635 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5887855b54-djpw5"
Apr 16 16:49:06.420179 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:06.420138 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5887855b54-djpw5" podStartSLOduration=33.420121982 podStartE2EDuration="33.420121982s" podCreationTimestamp="2026-04-16 16:48:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:49:05.911399654 +0000 UTC m=+71.052118397" watchObservedRunningTime="2026-04-16 16:49:06.420121982 +0000 UTC m=+71.560840724"
Apr 16
16:49:06.871910 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:06.871881 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-5887855b54-djpw5" Apr 16 16:49:06.873172 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:06.873152 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5887855b54-djpw5" Apr 16 16:49:09.767214 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:09.767186 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-md7k7" Apr 16 16:49:09.772537 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:09.772510 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-r6bck"] Apr 16 16:49:09.793831 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:09.793805 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-r6bck"] Apr 16 16:49:09.793974 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:09.793945 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-r6bck" Apr 16 16:49:09.797262 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:09.797246 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 16 16:49:09.797499 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:09.797482 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 16 16:49:09.797584 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:09.797523 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-jjdf6\"" Apr 16 16:49:09.881944 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:09.881914 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-xr578" event={"ID":"8af394bc-025e-4545-801f-0e6309febaa3","Type":"ContainerStarted","Data":"f6281ecf6d2744bfd9ab79b246a0cf8848d9246942e466c0ef628c6ecab33a81"} Apr 16 16:49:09.882087 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:09.881952 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-xr578" event={"ID":"8af394bc-025e-4545-801f-0e6309febaa3","Type":"ContainerStarted","Data":"0b2b373aba3d84b3aee475cef405681ce8e8638b6814ecb6866fbda7f0bee38e"} Apr 16 16:49:09.883324 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:09.883297 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-9v9hx" event={"ID":"363c07d0-bf5c-4368-a3fe-6d5136c2cd22","Type":"ContainerStarted","Data":"f1702ae7bcf5c4c46fe05d6a723f6f21cfd3ec241119270de2f56b400769fc24"} Apr 16 16:49:09.884826 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:09.884797 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-xn576" event={"ID":"7bc7dff0-16af-4031-a829-4427a2699284","Type":"ContainerStarted","Data":"e95df2b49cb060f05e4cdb4c17930e53d0b2bba8df7d92ddef66b2efd252bdbc"} Apr 16 16:49:09.902396 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:09.902341 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-xr578" podStartSLOduration=33.566231385 podStartE2EDuration="36.902328948s" podCreationTimestamp="2026-04-16 16:48:33 +0000 UTC" firstStartedPulling="2026-04-16 16:49:05.591064379 +0000 UTC m=+70.731783100" lastFinishedPulling="2026-04-16 16:49:08.927161943 +0000 UTC m=+74.067880663" observedRunningTime="2026-04-16 16:49:09.900822321 +0000 UTC m=+75.041541075" watchObservedRunningTime="2026-04-16 16:49:09.902328948 +0000 UTC m=+75.043047690" Apr 16 16:49:09.920176 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:09.920138 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-xn576" podStartSLOduration=33.647382717 podStartE2EDuration="36.920128523s" podCreationTimestamp="2026-04-16 16:48:33 +0000 UTC" firstStartedPulling="2026-04-16 16:49:05.657547039 +0000 UTC m=+70.798265759" lastFinishedPulling="2026-04-16 16:49:08.930292841 +0000 UTC m=+74.071011565" observedRunningTime="2026-04-16 16:49:09.919860042 +0000 UTC m=+75.060578786" watchObservedRunningTime="2026-04-16 16:49:09.920128523 +0000 UTC m=+75.060847265" Apr 16 16:49:09.938811 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:09.938773 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-9v9hx" podStartSLOduration=33.644090725 podStartE2EDuration="36.938760335s" podCreationTimestamp="2026-04-16 16:48:33 +0000 UTC" firstStartedPulling="2026-04-16 16:49:05.632488269 +0000 UTC m=+70.773206989" 
lastFinishedPulling="2026-04-16 16:49:08.92715788 +0000 UTC m=+74.067876599" observedRunningTime="2026-04-16 16:49:09.937688528 +0000 UTC m=+75.078407270" watchObservedRunningTime="2026-04-16 16:49:09.938760335 +0000 UTC m=+75.079479077" Apr 16 16:49:09.952672 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:09.952644 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/861f1460-ddcc-410d-a721-789653443c7b-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-r6bck\" (UID: \"861f1460-ddcc-410d-a721-789653443c7b\") " pod="openshift-insights/insights-runtime-extractor-r6bck" Apr 16 16:49:09.952772 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:09.952686 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/861f1460-ddcc-410d-a721-789653443c7b-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-r6bck\" (UID: \"861f1460-ddcc-410d-a721-789653443c7b\") " pod="openshift-insights/insights-runtime-extractor-r6bck" Apr 16 16:49:09.952772 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:09.952754 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/861f1460-ddcc-410d-a721-789653443c7b-data-volume\") pod \"insights-runtime-extractor-r6bck\" (UID: \"861f1460-ddcc-410d-a721-789653443c7b\") " pod="openshift-insights/insights-runtime-extractor-r6bck" Apr 16 16:49:09.952882 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:09.952853 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/861f1460-ddcc-410d-a721-789653443c7b-crio-socket\") pod \"insights-runtime-extractor-r6bck\" (UID: \"861f1460-ddcc-410d-a721-789653443c7b\") " 
pod="openshift-insights/insights-runtime-extractor-r6bck" Apr 16 16:49:09.952882 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:09.952875 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzxtp\" (UniqueName: \"kubernetes.io/projected/861f1460-ddcc-410d-a721-789653443c7b-kube-api-access-tzxtp\") pod \"insights-runtime-extractor-r6bck\" (UID: \"861f1460-ddcc-410d-a721-789653443c7b\") " pod="openshift-insights/insights-runtime-extractor-r6bck" Apr 16 16:49:10.053883 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:10.053850 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/861f1460-ddcc-410d-a721-789653443c7b-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-r6bck\" (UID: \"861f1460-ddcc-410d-a721-789653443c7b\") " pod="openshift-insights/insights-runtime-extractor-r6bck" Apr 16 16:49:10.054038 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:10.053893 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/861f1460-ddcc-410d-a721-789653443c7b-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-r6bck\" (UID: \"861f1460-ddcc-410d-a721-789653443c7b\") " pod="openshift-insights/insights-runtime-extractor-r6bck" Apr 16 16:49:10.054038 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:10.053927 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/861f1460-ddcc-410d-a721-789653443c7b-data-volume\") pod \"insights-runtime-extractor-r6bck\" (UID: \"861f1460-ddcc-410d-a721-789653443c7b\") " pod="openshift-insights/insights-runtime-extractor-r6bck" Apr 16 16:49:10.054038 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:10.053993 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" 
(UniqueName: \"kubernetes.io/host-path/861f1460-ddcc-410d-a721-789653443c7b-crio-socket\") pod \"insights-runtime-extractor-r6bck\" (UID: \"861f1460-ddcc-410d-a721-789653443c7b\") " pod="openshift-insights/insights-runtime-extractor-r6bck" Apr 16 16:49:10.054038 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:10.054016 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tzxtp\" (UniqueName: \"kubernetes.io/projected/861f1460-ddcc-410d-a721-789653443c7b-kube-api-access-tzxtp\") pod \"insights-runtime-extractor-r6bck\" (UID: \"861f1460-ddcc-410d-a721-789653443c7b\") " pod="openshift-insights/insights-runtime-extractor-r6bck" Apr 16 16:49:10.054365 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:10.054344 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/861f1460-ddcc-410d-a721-789653443c7b-crio-socket\") pod \"insights-runtime-extractor-r6bck\" (UID: \"861f1460-ddcc-410d-a721-789653443c7b\") " pod="openshift-insights/insights-runtime-extractor-r6bck" Apr 16 16:49:10.065837 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:10.065811 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/861f1460-ddcc-410d-a721-789653443c7b-data-volume\") pod \"insights-runtime-extractor-r6bck\" (UID: \"861f1460-ddcc-410d-a721-789653443c7b\") " pod="openshift-insights/insights-runtime-extractor-r6bck" Apr 16 16:49:10.065962 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:10.065887 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/861f1460-ddcc-410d-a721-789653443c7b-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-r6bck\" (UID: \"861f1460-ddcc-410d-a721-789653443c7b\") " pod="openshift-insights/insights-runtime-extractor-r6bck" Apr 16 16:49:10.065962 ip-10-0-138-58 
kubenswrapper[2573]: I0416 16:49:10.065897 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/861f1460-ddcc-410d-a721-789653443c7b-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-r6bck\" (UID: \"861f1460-ddcc-410d-a721-789653443c7b\") " pod="openshift-insights/insights-runtime-extractor-r6bck" Apr 16 16:49:10.066590 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:10.066570 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzxtp\" (UniqueName: \"kubernetes.io/projected/861f1460-ddcc-410d-a721-789653443c7b-kube-api-access-tzxtp\") pod \"insights-runtime-extractor-r6bck\" (UID: \"861f1460-ddcc-410d-a721-789653443c7b\") " pod="openshift-insights/insights-runtime-extractor-r6bck" Apr 16 16:49:10.103522 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:10.103502 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-r6bck" Apr 16 16:49:10.220802 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:10.220766 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-r6bck"] Apr 16 16:49:10.223539 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:49:10.223513 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod861f1460_ddcc_410d_a721_789653443c7b.slice/crio-f1a74fd69b9ee1b5a1d742e92a73b916db3d0d0a8b2728a39672fec246ace22f WatchSource:0}: Error finding container f1a74fd69b9ee1b5a1d742e92a73b916db3d0d0a8b2728a39672fec246ace22f: Status 404 returned error can't find the container with id f1a74fd69b9ee1b5a1d742e92a73b916db3d0d0a8b2728a39672fec246ace22f Apr 16 16:49:10.894930 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:10.894889 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-r6bck" 
event={"ID":"861f1460-ddcc-410d-a721-789653443c7b","Type":"ContainerStarted","Data":"20ce0e8f584d079f9eb4b3fbc50b75d1f5c37a6cc959f43cbf2048abb6785f93"} Apr 16 16:49:10.895357 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:10.894941 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-r6bck" event={"ID":"861f1460-ddcc-410d-a721-789653443c7b","Type":"ContainerStarted","Data":"f1a74fd69b9ee1b5a1d742e92a73b916db3d0d0a8b2728a39672fec246ace22f"} Apr 16 16:49:11.900338 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:11.900300 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-r6bck" event={"ID":"861f1460-ddcc-410d-a721-789653443c7b","Type":"ContainerStarted","Data":"d3d881bc252d4d628ec1ffcf71bd3f64e155491ff077b35fe6f048cb44332821"} Apr 16 16:49:13.862452 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:13.862426 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-lhgcr" Apr 16 16:49:13.907566 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:13.907539 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-r6bck" event={"ID":"861f1460-ddcc-410d-a721-789653443c7b","Type":"ContainerStarted","Data":"0c416e29926fcaab440a7f5469d183890e76d148806488f23fab22dc4dfec79f"} Apr 16 16:49:13.926100 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:13.926052 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-r6bck" podStartSLOduration=2.084589615 podStartE2EDuration="4.926033793s" podCreationTimestamp="2026-04-16 16:49:09 +0000 UTC" firstStartedPulling="2026-04-16 16:49:10.34611569 +0000 UTC m=+75.486834410" lastFinishedPulling="2026-04-16 16:49:13.187559867 +0000 UTC m=+78.328278588" observedRunningTime="2026-04-16 16:49:13.925179315 +0000 UTC m=+79.065898059" watchObservedRunningTime="2026-04-16 
16:49:13.926033793 +0000 UTC m=+79.066752536" Apr 16 16:49:20.011678 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:20.011647 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-wbt4v"] Apr 16 16:49:20.017628 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:20.017608 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-wbt4v" Apr 16 16:49:20.021173 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:20.021151 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 16 16:49:20.021490 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:20.021204 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-lxqh8\"" Apr 16 16:49:20.021750 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:20.021226 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 16 16:49:20.022372 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:20.022356 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 16 16:49:20.022644 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:20.022363 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 16 16:49:20.029151 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:20.029127 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/1fc72970-3e8e-4014-9362-eadf182a5df0-root\") pod \"node-exporter-wbt4v\" (UID: \"1fc72970-3e8e-4014-9362-eadf182a5df0\") " pod="openshift-monitoring/node-exporter-wbt4v" Apr 16 16:49:20.029423 ip-10-0-138-58 kubenswrapper[2573]: I0416 
16:49:20.029407 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/1fc72970-3e8e-4014-9362-eadf182a5df0-node-exporter-wtmp\") pod \"node-exporter-wbt4v\" (UID: \"1fc72970-3e8e-4014-9362-eadf182a5df0\") " pod="openshift-monitoring/node-exporter-wbt4v" Apr 16 16:49:20.029619 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:20.029604 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/1fc72970-3e8e-4014-9362-eadf182a5df0-node-exporter-tls\") pod \"node-exporter-wbt4v\" (UID: \"1fc72970-3e8e-4014-9362-eadf182a5df0\") " pod="openshift-monitoring/node-exporter-wbt4v" Apr 16 16:49:20.029794 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:20.029774 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1fc72970-3e8e-4014-9362-eadf182a5df0-metrics-client-ca\") pod \"node-exporter-wbt4v\" (UID: \"1fc72970-3e8e-4014-9362-eadf182a5df0\") " pod="openshift-monitoring/node-exporter-wbt4v" Apr 16 16:49:20.029939 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:20.029923 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/1fc72970-3e8e-4014-9362-eadf182a5df0-node-exporter-accelerators-collector-config\") pod \"node-exporter-wbt4v\" (UID: \"1fc72970-3e8e-4014-9362-eadf182a5df0\") " pod="openshift-monitoring/node-exporter-wbt4v" Apr 16 16:49:20.030053 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:20.030040 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/1fc72970-3e8e-4014-9362-eadf182a5df0-node-exporter-textfile\") pod 
\"node-exporter-wbt4v\" (UID: \"1fc72970-3e8e-4014-9362-eadf182a5df0\") " pod="openshift-monitoring/node-exporter-wbt4v" Apr 16 16:49:20.030158 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:20.030143 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ptdw\" (UniqueName: \"kubernetes.io/projected/1fc72970-3e8e-4014-9362-eadf182a5df0-kube-api-access-7ptdw\") pod \"node-exporter-wbt4v\" (UID: \"1fc72970-3e8e-4014-9362-eadf182a5df0\") " pod="openshift-monitoring/node-exporter-wbt4v" Apr 16 16:49:20.030265 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:20.030251 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1fc72970-3e8e-4014-9362-eadf182a5df0-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-wbt4v\" (UID: \"1fc72970-3e8e-4014-9362-eadf182a5df0\") " pod="openshift-monitoring/node-exporter-wbt4v" Apr 16 16:49:20.030438 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:20.030416 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1fc72970-3e8e-4014-9362-eadf182a5df0-sys\") pod \"node-exporter-wbt4v\" (UID: \"1fc72970-3e8e-4014-9362-eadf182a5df0\") " pod="openshift-monitoring/node-exporter-wbt4v" Apr 16 16:49:20.130915 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:20.130881 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/1fc72970-3e8e-4014-9362-eadf182a5df0-node-exporter-accelerators-collector-config\") pod \"node-exporter-wbt4v\" (UID: \"1fc72970-3e8e-4014-9362-eadf182a5df0\") " pod="openshift-monitoring/node-exporter-wbt4v" Apr 16 16:49:20.131166 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:20.131145 2573 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/1fc72970-3e8e-4014-9362-eadf182a5df0-node-exporter-textfile\") pod \"node-exporter-wbt4v\" (UID: \"1fc72970-3e8e-4014-9362-eadf182a5df0\") " pod="openshift-monitoring/node-exporter-wbt4v" Apr 16 16:49:20.131277 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:20.131264 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7ptdw\" (UniqueName: \"kubernetes.io/projected/1fc72970-3e8e-4014-9362-eadf182a5df0-kube-api-access-7ptdw\") pod \"node-exporter-wbt4v\" (UID: \"1fc72970-3e8e-4014-9362-eadf182a5df0\") " pod="openshift-monitoring/node-exporter-wbt4v" Apr 16 16:49:20.131404 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:20.131370 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1fc72970-3e8e-4014-9362-eadf182a5df0-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-wbt4v\" (UID: \"1fc72970-3e8e-4014-9362-eadf182a5df0\") " pod="openshift-monitoring/node-exporter-wbt4v" Apr 16 16:49:20.131535 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:20.131519 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1fc72970-3e8e-4014-9362-eadf182a5df0-sys\") pod \"node-exporter-wbt4v\" (UID: \"1fc72970-3e8e-4014-9362-eadf182a5df0\") " pod="openshift-monitoring/node-exporter-wbt4v" Apr 16 16:49:20.131641 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:20.131627 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/1fc72970-3e8e-4014-9362-eadf182a5df0-root\") pod \"node-exporter-wbt4v\" (UID: \"1fc72970-3e8e-4014-9362-eadf182a5df0\") " pod="openshift-monitoring/node-exporter-wbt4v" Apr 16 16:49:20.131761 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:20.131749 
2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/1fc72970-3e8e-4014-9362-eadf182a5df0-node-exporter-wtmp\") pod \"node-exporter-wbt4v\" (UID: \"1fc72970-3e8e-4014-9362-eadf182a5df0\") " pod="openshift-monitoring/node-exporter-wbt4v" Apr 16 16:49:20.131879 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:20.131865 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/1fc72970-3e8e-4014-9362-eadf182a5df0-node-exporter-tls\") pod \"node-exporter-wbt4v\" (UID: \"1fc72970-3e8e-4014-9362-eadf182a5df0\") " pod="openshift-monitoring/node-exporter-wbt4v" Apr 16 16:49:20.131940 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:20.131908 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1fc72970-3e8e-4014-9362-eadf182a5df0-sys\") pod \"node-exporter-wbt4v\" (UID: \"1fc72970-3e8e-4014-9362-eadf182a5df0\") " pod="openshift-monitoring/node-exporter-wbt4v" Apr 16 16:49:20.131998 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:20.131934 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1fc72970-3e8e-4014-9362-eadf182a5df0-metrics-client-ca\") pod \"node-exporter-wbt4v\" (UID: \"1fc72970-3e8e-4014-9362-eadf182a5df0\") " pod="openshift-monitoring/node-exporter-wbt4v" Apr 16 16:49:20.132047 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:20.131985 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/1fc72970-3e8e-4014-9362-eadf182a5df0-root\") pod \"node-exporter-wbt4v\" (UID: \"1fc72970-3e8e-4014-9362-eadf182a5df0\") " pod="openshift-monitoring/node-exporter-wbt4v" Apr 16 16:49:20.132101 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:20.132043 2573 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/1fc72970-3e8e-4014-9362-eadf182a5df0-node-exporter-wtmp\") pod \"node-exporter-wbt4v\" (UID: \"1fc72970-3e8e-4014-9362-eadf182a5df0\") " pod="openshift-monitoring/node-exporter-wbt4v" Apr 16 16:49:20.132101 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:49:20.132074 2573 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 16 16:49:20.132186 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:49:20.132165 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1fc72970-3e8e-4014-9362-eadf182a5df0-node-exporter-tls podName:1fc72970-3e8e-4014-9362-eadf182a5df0 nodeName:}" failed. No retries permitted until 2026-04-16 16:49:20.63214383 +0000 UTC m=+85.772862553 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/1fc72970-3e8e-4014-9362-eadf182a5df0-node-exporter-tls") pod "node-exporter-wbt4v" (UID: "1fc72970-3e8e-4014-9362-eadf182a5df0") : secret "node-exporter-tls" not found Apr 16 16:49:20.132742 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:20.132681 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1fc72970-3e8e-4014-9362-eadf182a5df0-metrics-client-ca\") pod \"node-exporter-wbt4v\" (UID: \"1fc72970-3e8e-4014-9362-eadf182a5df0\") " pod="openshift-monitoring/node-exporter-wbt4v" Apr 16 16:49:20.133050 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:20.133024 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/1fc72970-3e8e-4014-9362-eadf182a5df0-node-exporter-textfile\") pod \"node-exporter-wbt4v\" (UID: \"1fc72970-3e8e-4014-9362-eadf182a5df0\") " pod="openshift-monitoring/node-exporter-wbt4v" Apr 16 16:49:20.133233 ip-10-0-138-58 
kubenswrapper[2573]: I0416 16:49:20.133215 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/1fc72970-3e8e-4014-9362-eadf182a5df0-node-exporter-accelerators-collector-config\") pod \"node-exporter-wbt4v\" (UID: \"1fc72970-3e8e-4014-9362-eadf182a5df0\") " pod="openshift-monitoring/node-exporter-wbt4v" Apr 16 16:49:20.134497 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:20.134477 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1fc72970-3e8e-4014-9362-eadf182a5df0-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-wbt4v\" (UID: \"1fc72970-3e8e-4014-9362-eadf182a5df0\") " pod="openshift-monitoring/node-exporter-wbt4v" Apr 16 16:49:20.141337 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:20.141317 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ptdw\" (UniqueName: \"kubernetes.io/projected/1fc72970-3e8e-4014-9362-eadf182a5df0-kube-api-access-7ptdw\") pod \"node-exporter-wbt4v\" (UID: \"1fc72970-3e8e-4014-9362-eadf182a5df0\") " pod="openshift-monitoring/node-exporter-wbt4v" Apr 16 16:49:20.358049 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:20.358017 2573 patch_prober.go:28] interesting pod/image-registry-558df76499-sw5fx container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 16 16:49:20.358189 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:20.358067 2573 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-558df76499-sw5fx" podUID="d030f4e9-69ab-40db-8fd6-d2f53d467cdb" containerName="registry" probeResult="failure" output="HTTP probe failed 
with statuscode: 503" Apr 16 16:49:20.638222 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:20.638147 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/1fc72970-3e8e-4014-9362-eadf182a5df0-node-exporter-tls\") pod \"node-exporter-wbt4v\" (UID: \"1fc72970-3e8e-4014-9362-eadf182a5df0\") " pod="openshift-monitoring/node-exporter-wbt4v" Apr 16 16:49:20.640461 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:20.640432 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/1fc72970-3e8e-4014-9362-eadf182a5df0-node-exporter-tls\") pod \"node-exporter-wbt4v\" (UID: \"1fc72970-3e8e-4014-9362-eadf182a5df0\") " pod="openshift-monitoring/node-exporter-wbt4v" Apr 16 16:49:20.931067 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:20.930993 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-wbt4v" Apr 16 16:49:20.940670 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:49:20.940631 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1fc72970_3e8e_4014_9362_eadf182a5df0.slice/crio-7a1348ea8906b1b57691daa640aa6b2bcc6dcf2a7ba63f39fd70410d5cce01d6 WatchSource:0}: Error finding container 7a1348ea8906b1b57691daa640aa6b2bcc6dcf2a7ba63f39fd70410d5cce01d6: Status 404 returned error can't find the container with id 7a1348ea8906b1b57691daa640aa6b2bcc6dcf2a7ba63f39fd70410d5cce01d6 Apr 16 16:49:21.848033 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:21.847969 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-558df76499-sw5fx" Apr 16 16:49:21.931277 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:21.931250 2573 generic.go:358] "Generic (PLEG): container finished" podID="1fc72970-3e8e-4014-9362-eadf182a5df0" 
containerID="a3a6c1c0aaedafc93d13ebc117837813a7cbdf1ca6c9f15532df41ff66f63d53" exitCode=0 Apr 16 16:49:21.931439 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:21.931320 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-wbt4v" event={"ID":"1fc72970-3e8e-4014-9362-eadf182a5df0","Type":"ContainerDied","Data":"a3a6c1c0aaedafc93d13ebc117837813a7cbdf1ca6c9f15532df41ff66f63d53"} Apr 16 16:49:21.931439 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:21.931356 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-wbt4v" event={"ID":"1fc72970-3e8e-4014-9362-eadf182a5df0","Type":"ContainerStarted","Data":"7a1348ea8906b1b57691daa640aa6b2bcc6dcf2a7ba63f39fd70410d5cce01d6"} Apr 16 16:49:22.936628 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:22.936597 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-wbt4v" event={"ID":"1fc72970-3e8e-4014-9362-eadf182a5df0","Type":"ContainerStarted","Data":"00a43fe4d5d84b41e477f43db6e226e8f2e89038bea0a58913f71d508114ff50"} Apr 16 16:49:22.936628 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:22.936632 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-wbt4v" event={"ID":"1fc72970-3e8e-4014-9362-eadf182a5df0","Type":"ContainerStarted","Data":"81eff46f91bdccf58d15a7e5744b03dd31382d8386e2eca116cf719050b352f8"} Apr 16 16:49:22.973677 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:22.973628 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-wbt4v" podStartSLOduration=3.335912742 podStartE2EDuration="3.973613492s" podCreationTimestamp="2026-04-16 16:49:19 +0000 UTC" firstStartedPulling="2026-04-16 16:49:20.942828167 +0000 UTC m=+86.083546886" lastFinishedPulling="2026-04-16 16:49:21.580528913 +0000 UTC m=+86.721247636" observedRunningTime="2026-04-16 16:49:22.972603216 +0000 UTC m=+88.113321951" 
watchObservedRunningTime="2026-04-16 16:49:22.973613492 +0000 UTC m=+88.114332233" Apr 16 16:49:33.006995 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:33.006956 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-558df76499-sw5fx"] Apr 16 16:49:57.036538 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:57.036502 2573 generic.go:358] "Generic (PLEG): container finished" podID="4b32d91f-2c9d-4d71-b910-066e212015e3" containerID="39ee3dd74a37f13140090dcc159208e056ff3eace72f9a0a82e5636090c50654" exitCode=0 Apr 16 16:49:57.036930 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:57.036548 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-7w64z" event={"ID":"4b32d91f-2c9d-4d71-b910-066e212015e3","Type":"ContainerDied","Data":"39ee3dd74a37f13140090dcc159208e056ff3eace72f9a0a82e5636090c50654"} Apr 16 16:49:57.036930 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:57.036847 2573 scope.go:117] "RemoveContainer" containerID="39ee3dd74a37f13140090dcc159208e056ff3eace72f9a0a82e5636090c50654" Apr 16 16:49:58.026115 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:58.026079 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-558df76499-sw5fx" podUID="d030f4e9-69ab-40db-8fd6-d2f53d467cdb" containerName="registry" containerID="cri-o://ba68d95393aa1a593c2f0b438adfb76cd0e0c26521d7c50db90fdb0e90af6269" gracePeriod=30 Apr 16 16:49:58.040908 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:58.040883 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-7w64z" event={"ID":"4b32d91f-2c9d-4d71-b910-066e212015e3","Type":"ContainerStarted","Data":"317146b0748f0ffb618e3c9c1b14b311e53869ae916faf6025ee5a13fa8f6b85"} Apr 16 16:49:58.255246 ip-10-0-138-58 kubenswrapper[2573]: 
I0416 16:49:58.255225 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-558df76499-sw5fx" Apr 16 16:49:58.306657 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:58.306632 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x5g82\" (UniqueName: \"kubernetes.io/projected/d030f4e9-69ab-40db-8fd6-d2f53d467cdb-kube-api-access-x5g82\") pod \"d030f4e9-69ab-40db-8fd6-d2f53d467cdb\" (UID: \"d030f4e9-69ab-40db-8fd6-d2f53d467cdb\") " Apr 16 16:49:58.306758 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:58.306667 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/d030f4e9-69ab-40db-8fd6-d2f53d467cdb-image-registry-private-configuration\") pod \"d030f4e9-69ab-40db-8fd6-d2f53d467cdb\" (UID: \"d030f4e9-69ab-40db-8fd6-d2f53d467cdb\") " Apr 16 16:49:58.306758 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:58.306705 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d030f4e9-69ab-40db-8fd6-d2f53d467cdb-ca-trust-extracted\") pod \"d030f4e9-69ab-40db-8fd6-d2f53d467cdb\" (UID: \"d030f4e9-69ab-40db-8fd6-d2f53d467cdb\") " Apr 16 16:49:58.306758 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:58.306726 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d030f4e9-69ab-40db-8fd6-d2f53d467cdb-trusted-ca\") pod \"d030f4e9-69ab-40db-8fd6-d2f53d467cdb\" (UID: \"d030f4e9-69ab-40db-8fd6-d2f53d467cdb\") " Apr 16 16:49:58.306758 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:58.306744 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/d030f4e9-69ab-40db-8fd6-d2f53d467cdb-installation-pull-secrets\") pod \"d030f4e9-69ab-40db-8fd6-d2f53d467cdb\" (UID: \"d030f4e9-69ab-40db-8fd6-d2f53d467cdb\") " Apr 16 16:49:58.306957 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:58.306770 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d030f4e9-69ab-40db-8fd6-d2f53d467cdb-bound-sa-token\") pod \"d030f4e9-69ab-40db-8fd6-d2f53d467cdb\" (UID: \"d030f4e9-69ab-40db-8fd6-d2f53d467cdb\") " Apr 16 16:49:58.306957 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:58.306801 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d030f4e9-69ab-40db-8fd6-d2f53d467cdb-registry-certificates\") pod \"d030f4e9-69ab-40db-8fd6-d2f53d467cdb\" (UID: \"d030f4e9-69ab-40db-8fd6-d2f53d467cdb\") " Apr 16 16:49:58.306957 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:58.306820 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d030f4e9-69ab-40db-8fd6-d2f53d467cdb-registry-tls\") pod \"d030f4e9-69ab-40db-8fd6-d2f53d467cdb\" (UID: \"d030f4e9-69ab-40db-8fd6-d2f53d467cdb\") " Apr 16 16:49:58.307793 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:58.307286 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d030f4e9-69ab-40db-8fd6-d2f53d467cdb-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "d030f4e9-69ab-40db-8fd6-d2f53d467cdb" (UID: "d030f4e9-69ab-40db-8fd6-d2f53d467cdb"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:49:58.307793 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:58.307313 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d030f4e9-69ab-40db-8fd6-d2f53d467cdb-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "d030f4e9-69ab-40db-8fd6-d2f53d467cdb" (UID: "d030f4e9-69ab-40db-8fd6-d2f53d467cdb"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:49:58.309344 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:58.309320 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d030f4e9-69ab-40db-8fd6-d2f53d467cdb-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "d030f4e9-69ab-40db-8fd6-d2f53d467cdb" (UID: "d030f4e9-69ab-40db-8fd6-d2f53d467cdb"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:49:58.309504 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:58.309321 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d030f4e9-69ab-40db-8fd6-d2f53d467cdb-kube-api-access-x5g82" (OuterVolumeSpecName: "kube-api-access-x5g82") pod "d030f4e9-69ab-40db-8fd6-d2f53d467cdb" (UID: "d030f4e9-69ab-40db-8fd6-d2f53d467cdb"). InnerVolumeSpecName "kube-api-access-x5g82". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:49:58.309504 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:58.309394 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d030f4e9-69ab-40db-8fd6-d2f53d467cdb-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "d030f4e9-69ab-40db-8fd6-d2f53d467cdb" (UID: "d030f4e9-69ab-40db-8fd6-d2f53d467cdb"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:49:58.309504 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:58.309373 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d030f4e9-69ab-40db-8fd6-d2f53d467cdb-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "d030f4e9-69ab-40db-8fd6-d2f53d467cdb" (UID: "d030f4e9-69ab-40db-8fd6-d2f53d467cdb"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:49:58.309504 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:58.309467 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d030f4e9-69ab-40db-8fd6-d2f53d467cdb-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "d030f4e9-69ab-40db-8fd6-d2f53d467cdb" (UID: "d030f4e9-69ab-40db-8fd6-d2f53d467cdb"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:49:58.314986 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:58.314965 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d030f4e9-69ab-40db-8fd6-d2f53d467cdb-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "d030f4e9-69ab-40db-8fd6-d2f53d467cdb" (UID: "d030f4e9-69ab-40db-8fd6-d2f53d467cdb"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:49:58.407555 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:58.407533 2573 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d030f4e9-69ab-40db-8fd6-d2f53d467cdb-registry-certificates\") on node \"ip-10-0-138-58.ec2.internal\" DevicePath \"\"" Apr 16 16:49:58.407555 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:58.407554 2573 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d030f4e9-69ab-40db-8fd6-d2f53d467cdb-registry-tls\") on node \"ip-10-0-138-58.ec2.internal\" DevicePath \"\"" Apr 16 16:49:58.407688 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:58.407564 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-x5g82\" (UniqueName: \"kubernetes.io/projected/d030f4e9-69ab-40db-8fd6-d2f53d467cdb-kube-api-access-x5g82\") on node \"ip-10-0-138-58.ec2.internal\" DevicePath \"\"" Apr 16 16:49:58.407688 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:58.407575 2573 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/d030f4e9-69ab-40db-8fd6-d2f53d467cdb-image-registry-private-configuration\") on node \"ip-10-0-138-58.ec2.internal\" DevicePath \"\"" Apr 16 16:49:58.407688 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:58.407585 2573 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d030f4e9-69ab-40db-8fd6-d2f53d467cdb-ca-trust-extracted\") on node \"ip-10-0-138-58.ec2.internal\" DevicePath \"\"" Apr 16 16:49:58.407688 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:58.407594 2573 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d030f4e9-69ab-40db-8fd6-d2f53d467cdb-trusted-ca\") on node \"ip-10-0-138-58.ec2.internal\" DevicePath \"\"" Apr 16 
16:49:58.407688 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:58.407603 2573 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d030f4e9-69ab-40db-8fd6-d2f53d467cdb-installation-pull-secrets\") on node \"ip-10-0-138-58.ec2.internal\" DevicePath \"\"" Apr 16 16:49:58.407688 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:58.407613 2573 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d030f4e9-69ab-40db-8fd6-d2f53d467cdb-bound-sa-token\") on node \"ip-10-0-138-58.ec2.internal\" DevicePath \"\"" Apr 16 16:49:58.561189 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:58.561129 2573 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8596fb7f8-nnmch" podUID="536018da-d22a-4d5a-a54d-4d2116c68151" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 16 16:49:59.045910 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:59.045881 2573 generic.go:358] "Generic (PLEG): container finished" podID="d030f4e9-69ab-40db-8fd6-d2f53d467cdb" containerID="ba68d95393aa1a593c2f0b438adfb76cd0e0c26521d7c50db90fdb0e90af6269" exitCode=0 Apr 16 16:49:59.046267 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:59.045932 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-558df76499-sw5fx" event={"ID":"d030f4e9-69ab-40db-8fd6-d2f53d467cdb","Type":"ContainerDied","Data":"ba68d95393aa1a593c2f0b438adfb76cd0e0c26521d7c50db90fdb0e90af6269"} Apr 16 16:49:59.046267 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:59.045949 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-558df76499-sw5fx" Apr 16 16:49:59.046267 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:59.045962 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-558df76499-sw5fx" event={"ID":"d030f4e9-69ab-40db-8fd6-d2f53d467cdb","Type":"ContainerDied","Data":"f8b9670725b26fab7121bfc542751791e95756bb2fa6bfb01bfc4f60f9492679"} Apr 16 16:49:59.046267 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:59.045983 2573 scope.go:117] "RemoveContainer" containerID="ba68d95393aa1a593c2f0b438adfb76cd0e0c26521d7c50db90fdb0e90af6269" Apr 16 16:49:59.056211 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:59.056194 2573 scope.go:117] "RemoveContainer" containerID="ba68d95393aa1a593c2f0b438adfb76cd0e0c26521d7c50db90fdb0e90af6269" Apr 16 16:49:59.056522 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:49:59.056500 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba68d95393aa1a593c2f0b438adfb76cd0e0c26521d7c50db90fdb0e90af6269\": container with ID starting with ba68d95393aa1a593c2f0b438adfb76cd0e0c26521d7c50db90fdb0e90af6269 not found: ID does not exist" containerID="ba68d95393aa1a593c2f0b438adfb76cd0e0c26521d7c50db90fdb0e90af6269" Apr 16 16:49:59.056576 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:59.056532 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba68d95393aa1a593c2f0b438adfb76cd0e0c26521d7c50db90fdb0e90af6269"} err="failed to get container status \"ba68d95393aa1a593c2f0b438adfb76cd0e0c26521d7c50db90fdb0e90af6269\": rpc error: code = NotFound desc = could not find container \"ba68d95393aa1a593c2f0b438adfb76cd0e0c26521d7c50db90fdb0e90af6269\": container with ID starting with ba68d95393aa1a593c2f0b438adfb76cd0e0c26521d7c50db90fdb0e90af6269 not found: ID does not exist" Apr 16 16:49:59.069130 ip-10-0-138-58 kubenswrapper[2573]: I0416 
16:49:59.069098 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-558df76499-sw5fx"] Apr 16 16:49:59.073203 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:59.073179 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-558df76499-sw5fx"] Apr 16 16:49:59.421475 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:49:59.421418 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d030f4e9-69ab-40db-8fd6-d2f53d467cdb" path="/var/lib/kubelet/pods/d030f4e9-69ab-40db-8fd6-d2f53d467cdb/volumes" Apr 16 16:50:08.560787 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:50:08.560751 2573 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8596fb7f8-nnmch" podUID="536018da-d22a-4d5a-a54d-4d2116c68151" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 16 16:50:12.091715 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:50:12.091668 2573 generic.go:358] "Generic (PLEG): container finished" podID="ebea9173-b1aa-4ef6-a1bb-6b7483b70fd2" containerID="abadd7c229b0d765f3a8054c53964b693d8a58011846c784178f09d8877efb03" exitCode=0 Apr 16 16:50:12.092159 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:50:12.091747 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-x9csm" event={"ID":"ebea9173-b1aa-4ef6-a1bb-6b7483b70fd2","Type":"ContainerDied","Data":"abadd7c229b0d765f3a8054c53964b693d8a58011846c784178f09d8877efb03"} Apr 16 16:50:12.092159 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:50:12.092078 2573 scope.go:117] "RemoveContainer" containerID="abadd7c229b0d765f3a8054c53964b693d8a58011846c784178f09d8877efb03" Apr 16 16:50:13.096748 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:50:13.096710 2573 generic.go:358] "Generic (PLEG): container finished" podID="2b75e55f-5bdd-4cbb-abd0-69be2a62852e" 
containerID="43d9b63ef5f4f55f1909d8cb9e0976e537cf9d7feaf9ac6c48d640644787c783" exitCode=0 Apr 16 16:50:13.097192 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:50:13.096787 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-7wg74" event={"ID":"2b75e55f-5bdd-4cbb-abd0-69be2a62852e","Type":"ContainerDied","Data":"43d9b63ef5f4f55f1909d8cb9e0976e537cf9d7feaf9ac6c48d640644787c783"} Apr 16 16:50:13.097192 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:50:13.097131 2573 scope.go:117] "RemoveContainer" containerID="43d9b63ef5f4f55f1909d8cb9e0976e537cf9d7feaf9ac6c48d640644787c783" Apr 16 16:50:13.099015 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:50:13.098964 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-x9csm" event={"ID":"ebea9173-b1aa-4ef6-a1bb-6b7483b70fd2","Type":"ContainerStarted","Data":"e41ff0ae0e6f8dea42b5194c8cc8b14222f81cf97a6f73b71dc8f685a8a2ea64"} Apr 16 16:50:14.103876 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:50:14.103841 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-7wg74" event={"ID":"2b75e55f-5bdd-4cbb-abd0-69be2a62852e","Type":"ContainerStarted","Data":"a5a86f6ce8dd1e856ef147b1b48ff916be20a549fd219cac09a23b094268e3a6"} Apr 16 16:50:18.560730 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:50:18.560673 2573 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8596fb7f8-nnmch" podUID="536018da-d22a-4d5a-a54d-4d2116c68151" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 16 16:50:18.561181 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:50:18.560776 2573 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8596fb7f8-nnmch" Apr 16 16:50:18.561272 
ip-10-0-138-58 kubenswrapper[2573]: I0416 16:50:18.561250 2573 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"1d2dfc69e221ffc42e33433e95184da0c0e21d6fcf12c3a5b32dc425e8f7192c"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8596fb7f8-nnmch" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 16 16:50:18.561312 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:50:18.561299 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8596fb7f8-nnmch" podUID="536018da-d22a-4d5a-a54d-4d2116c68151" containerName="service-proxy" containerID="cri-o://1d2dfc69e221ffc42e33433e95184da0c0e21d6fcf12c3a5b32dc425e8f7192c" gracePeriod=30 Apr 16 16:50:19.125443 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:50:19.125411 2573 generic.go:358] "Generic (PLEG): container finished" podID="536018da-d22a-4d5a-a54d-4d2116c68151" containerID="1d2dfc69e221ffc42e33433e95184da0c0e21d6fcf12c3a5b32dc425e8f7192c" exitCode=2 Apr 16 16:50:19.125593 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:50:19.125494 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8596fb7f8-nnmch" event={"ID":"536018da-d22a-4d5a-a54d-4d2116c68151","Type":"ContainerDied","Data":"1d2dfc69e221ffc42e33433e95184da0c0e21d6fcf12c3a5b32dc425e8f7192c"} Apr 16 16:50:19.125593 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:50:19.125542 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8596fb7f8-nnmch" event={"ID":"536018da-d22a-4d5a-a54d-4d2116c68151","Type":"ContainerStarted","Data":"0732d20231e830b2d0c3ad43d71e324dea5a6f0c1fd941b4bcede5f128f505be"} Apr 16 16:52:55.325075 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:52:55.325046 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-jgsfb_6c99cce0-b27a-481f-8825-9d205581b7d0/console-operator/1.log" Apr 16 16:52:55.325554 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:52:55.325215 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-jgsfb_6c99cce0-b27a-481f-8825-9d205581b7d0/console-operator/1.log" Apr 16 16:52:55.335208 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:52:55.335187 2573 kubelet.go:1628] "Image garbage collection succeeded" Apr 16 16:53:41.074638 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:53:41.074554 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-65f5d85b79-ppb2c"] Apr 16 16:53:41.075044 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:53:41.074822 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d030f4e9-69ab-40db-8fd6-d2f53d467cdb" containerName="registry" Apr 16 16:53:41.075044 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:53:41.074845 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="d030f4e9-69ab-40db-8fd6-d2f53d467cdb" containerName="registry" Apr 16 16:53:41.075044 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:53:41.074908 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="d030f4e9-69ab-40db-8fd6-d2f53d467cdb" containerName="registry" Apr 16 16:53:41.076821 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:53:41.076802 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-65f5d85b79-ppb2c" Apr 16 16:53:41.079788 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:53:41.079763 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 16 16:53:41.080130 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:53:41.080115 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 16 16:53:41.081338 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:53:41.081315 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 16 16:53:41.081444 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:53:41.081337 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 16 16:53:41.081444 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:53:41.081357 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-cj4jz\"" Apr 16 16:53:41.081531 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:53:41.081454 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 16 16:53:41.087111 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:53:41.087088 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-65f5d85b79-ppb2c"] Apr 16 16:53:41.208189 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:53:41.208150 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/9f5c241d-078c-45ed-a269-efb2dd6acc38-manager-config\") pod \"lws-controller-manager-65f5d85b79-ppb2c\" (UID: \"9f5c241d-078c-45ed-a269-efb2dd6acc38\") " 
pod="openshift-lws-operator/lws-controller-manager-65f5d85b79-ppb2c" Apr 16 16:53:41.208189 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:53:41.208195 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/9f5c241d-078c-45ed-a269-efb2dd6acc38-metrics-cert\") pod \"lws-controller-manager-65f5d85b79-ppb2c\" (UID: \"9f5c241d-078c-45ed-a269-efb2dd6acc38\") " pod="openshift-lws-operator/lws-controller-manager-65f5d85b79-ppb2c" Apr 16 16:53:41.208454 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:53:41.208215 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ssrm\" (UniqueName: \"kubernetes.io/projected/9f5c241d-078c-45ed-a269-efb2dd6acc38-kube-api-access-2ssrm\") pod \"lws-controller-manager-65f5d85b79-ppb2c\" (UID: \"9f5c241d-078c-45ed-a269-efb2dd6acc38\") " pod="openshift-lws-operator/lws-controller-manager-65f5d85b79-ppb2c" Apr 16 16:53:41.208454 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:53:41.208299 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9f5c241d-078c-45ed-a269-efb2dd6acc38-cert\") pod \"lws-controller-manager-65f5d85b79-ppb2c\" (UID: \"9f5c241d-078c-45ed-a269-efb2dd6acc38\") " pod="openshift-lws-operator/lws-controller-manager-65f5d85b79-ppb2c" Apr 16 16:53:41.308749 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:53:41.308706 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/9f5c241d-078c-45ed-a269-efb2dd6acc38-manager-config\") pod \"lws-controller-manager-65f5d85b79-ppb2c\" (UID: \"9f5c241d-078c-45ed-a269-efb2dd6acc38\") " pod="openshift-lws-operator/lws-controller-manager-65f5d85b79-ppb2c" Apr 16 16:53:41.308749 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:53:41.308757 2573 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/9f5c241d-078c-45ed-a269-efb2dd6acc38-metrics-cert\") pod \"lws-controller-manager-65f5d85b79-ppb2c\" (UID: \"9f5c241d-078c-45ed-a269-efb2dd6acc38\") " pod="openshift-lws-operator/lws-controller-manager-65f5d85b79-ppb2c" Apr 16 16:53:41.308959 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:53:41.308775 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2ssrm\" (UniqueName: \"kubernetes.io/projected/9f5c241d-078c-45ed-a269-efb2dd6acc38-kube-api-access-2ssrm\") pod \"lws-controller-manager-65f5d85b79-ppb2c\" (UID: \"9f5c241d-078c-45ed-a269-efb2dd6acc38\") " pod="openshift-lws-operator/lws-controller-manager-65f5d85b79-ppb2c" Apr 16 16:53:41.308959 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:53:41.308809 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9f5c241d-078c-45ed-a269-efb2dd6acc38-cert\") pod \"lws-controller-manager-65f5d85b79-ppb2c\" (UID: \"9f5c241d-078c-45ed-a269-efb2dd6acc38\") " pod="openshift-lws-operator/lws-controller-manager-65f5d85b79-ppb2c" Apr 16 16:53:41.309551 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:53:41.309525 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/9f5c241d-078c-45ed-a269-efb2dd6acc38-manager-config\") pod \"lws-controller-manager-65f5d85b79-ppb2c\" (UID: \"9f5c241d-078c-45ed-a269-efb2dd6acc38\") " pod="openshift-lws-operator/lws-controller-manager-65f5d85b79-ppb2c" Apr 16 16:53:41.311233 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:53:41.311202 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/9f5c241d-078c-45ed-a269-efb2dd6acc38-metrics-cert\") pod \"lws-controller-manager-65f5d85b79-ppb2c\" (UID: \"9f5c241d-078c-45ed-a269-efb2dd6acc38\") " 
pod="openshift-lws-operator/lws-controller-manager-65f5d85b79-ppb2c" Apr 16 16:53:41.311342 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:53:41.311281 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9f5c241d-078c-45ed-a269-efb2dd6acc38-cert\") pod \"lws-controller-manager-65f5d85b79-ppb2c\" (UID: \"9f5c241d-078c-45ed-a269-efb2dd6acc38\") " pod="openshift-lws-operator/lws-controller-manager-65f5d85b79-ppb2c" Apr 16 16:53:41.322096 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:53:41.322069 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ssrm\" (UniqueName: \"kubernetes.io/projected/9f5c241d-078c-45ed-a269-efb2dd6acc38-kube-api-access-2ssrm\") pod \"lws-controller-manager-65f5d85b79-ppb2c\" (UID: \"9f5c241d-078c-45ed-a269-efb2dd6acc38\") " pod="openshift-lws-operator/lws-controller-manager-65f5d85b79-ppb2c" Apr 16 16:53:41.386337 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:53:41.386243 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-65f5d85b79-ppb2c" Apr 16 16:53:41.524779 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:53:41.524756 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-65f5d85b79-ppb2c"] Apr 16 16:53:41.527190 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:53:41.527152 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f5c241d_078c_45ed_a269_efb2dd6acc38.slice/crio-5907d9dd9ac7e3685272820e03e69c940357954ba7a1db4749197ed83f033749 WatchSource:0}: Error finding container 5907d9dd9ac7e3685272820e03e69c940357954ba7a1db4749197ed83f033749: Status 404 returned error can't find the container with id 5907d9dd9ac7e3685272820e03e69c940357954ba7a1db4749197ed83f033749 Apr 16 16:53:41.528813 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:53:41.528798 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 16:53:41.695052 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:53:41.694971 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-65f5d85b79-ppb2c" event={"ID":"9f5c241d-078c-45ed-a269-efb2dd6acc38","Type":"ContainerStarted","Data":"5907d9dd9ac7e3685272820e03e69c940357954ba7a1db4749197ed83f033749"} Apr 16 16:53:44.706727 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:53:44.706690 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-65f5d85b79-ppb2c" event={"ID":"9f5c241d-078c-45ed-a269-efb2dd6acc38","Type":"ContainerStarted","Data":"405e7088b651235038fafba899f352c96942b6475afce27e761b39ca0517d7a0"} Apr 16 16:53:44.707125 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:53:44.706741 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-65f5d85b79-ppb2c" Apr 16 
16:53:44.723904 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:53:44.723850 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-65f5d85b79-ppb2c" podStartSLOduration=1.522452111 podStartE2EDuration="3.723834215s" podCreationTimestamp="2026-04-16 16:53:41 +0000 UTC" firstStartedPulling="2026-04-16 16:53:41.528925971 +0000 UTC m=+346.669644690" lastFinishedPulling="2026-04-16 16:53:43.73030807 +0000 UTC m=+348.871026794" observedRunningTime="2026-04-16 16:53:44.723615962 +0000 UTC m=+349.864334706" watchObservedRunningTime="2026-04-16 16:53:44.723834215 +0000 UTC m=+349.864552957" Apr 16 16:53:55.711115 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:53:55.711088 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-65f5d85b79-ppb2c" Apr 16 16:54:29.895203 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:54:29.895169 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6c886788f8-d76cw"] Apr 16 16:54:29.899473 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:54:29.899453 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-d76cw" Apr 16 16:54:29.902760 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:54:29.902737 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"plugin-serving-cert\"" Apr 16 16:54:29.902866 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:54:29.902807 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-rzzrl\"" Apr 16 16:54:29.903065 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:54:29.903049 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 16 16:54:29.903151 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:54:29.903069 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kuadrant-console-nginx-conf\"" Apr 16 16:54:29.904799 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:54:29.904783 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 16 16:54:29.913493 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:54:29.913474 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6c886788f8-d76cw"] Apr 16 16:54:29.939276 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:54:29.939255 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/169c36ca-e161-40b0-9d76-166f4626fa3e-nginx-conf\") pod \"kuadrant-console-plugin-6c886788f8-d76cw\" (UID: \"169c36ca-e161-40b0-9d76-166f4626fa3e\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-d76cw" Apr 16 16:54:29.939398 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:54:29.939300 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2b682\" (UniqueName: 
\"kubernetes.io/projected/169c36ca-e161-40b0-9d76-166f4626fa3e-kube-api-access-2b682\") pod \"kuadrant-console-plugin-6c886788f8-d76cw\" (UID: \"169c36ca-e161-40b0-9d76-166f4626fa3e\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-d76cw" Apr 16 16:54:29.939483 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:54:29.939433 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/169c36ca-e161-40b0-9d76-166f4626fa3e-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-d76cw\" (UID: \"169c36ca-e161-40b0-9d76-166f4626fa3e\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-d76cw" Apr 16 16:54:30.040435 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:54:30.040412 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2b682\" (UniqueName: \"kubernetes.io/projected/169c36ca-e161-40b0-9d76-166f4626fa3e-kube-api-access-2b682\") pod \"kuadrant-console-plugin-6c886788f8-d76cw\" (UID: \"169c36ca-e161-40b0-9d76-166f4626fa3e\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-d76cw" Apr 16 16:54:30.040517 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:54:30.040465 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/169c36ca-e161-40b0-9d76-166f4626fa3e-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-d76cw\" (UID: \"169c36ca-e161-40b0-9d76-166f4626fa3e\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-d76cw" Apr 16 16:54:30.040557 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:54:30.040538 2573 secret.go:189] Couldn't get secret kuadrant-system/plugin-serving-cert: secret "plugin-serving-cert" not found Apr 16 16:54:30.040600 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:54:30.040589 2573 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/169c36ca-e161-40b0-9d76-166f4626fa3e-plugin-serving-cert podName:169c36ca-e161-40b0-9d76-166f4626fa3e nodeName:}" failed. No retries permitted until 2026-04-16 16:54:30.540574679 +0000 UTC m=+395.681293400 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/169c36ca-e161-40b0-9d76-166f4626fa3e-plugin-serving-cert") pod "kuadrant-console-plugin-6c886788f8-d76cw" (UID: "169c36ca-e161-40b0-9d76-166f4626fa3e") : secret "plugin-serving-cert" not found Apr 16 16:54:30.040659 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:54:30.040639 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/169c36ca-e161-40b0-9d76-166f4626fa3e-nginx-conf\") pod \"kuadrant-console-plugin-6c886788f8-d76cw\" (UID: \"169c36ca-e161-40b0-9d76-166f4626fa3e\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-d76cw" Apr 16 16:54:30.041150 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:54:30.041134 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/169c36ca-e161-40b0-9d76-166f4626fa3e-nginx-conf\") pod \"kuadrant-console-plugin-6c886788f8-d76cw\" (UID: \"169c36ca-e161-40b0-9d76-166f4626fa3e\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-d76cw" Apr 16 16:54:30.051785 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:54:30.051757 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2b682\" (UniqueName: \"kubernetes.io/projected/169c36ca-e161-40b0-9d76-166f4626fa3e-kube-api-access-2b682\") pod \"kuadrant-console-plugin-6c886788f8-d76cw\" (UID: \"169c36ca-e161-40b0-9d76-166f4626fa3e\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-d76cw" Apr 16 16:54:30.543810 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:54:30.543779 2573 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/169c36ca-e161-40b0-9d76-166f4626fa3e-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-d76cw\" (UID: \"169c36ca-e161-40b0-9d76-166f4626fa3e\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-d76cw" Apr 16 16:54:30.546193 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:54:30.546162 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/169c36ca-e161-40b0-9d76-166f4626fa3e-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-d76cw\" (UID: \"169c36ca-e161-40b0-9d76-166f4626fa3e\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-d76cw" Apr 16 16:54:30.808018 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:54:30.807956 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-d76cw" Apr 16 16:54:30.936619 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:54:30.936591 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6c886788f8-d76cw"] Apr 16 16:54:30.939423 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:54:30.939395 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod169c36ca_e161_40b0_9d76_166f4626fa3e.slice/crio-d5be8b6f6540dbb63ab576429313097fa1c87dde609947f07c1696e3644c022e WatchSource:0}: Error finding container d5be8b6f6540dbb63ab576429313097fa1c87dde609947f07c1696e3644c022e: Status 404 returned error can't find the container with id d5be8b6f6540dbb63ab576429313097fa1c87dde609947f07c1696e3644c022e Apr 16 16:54:31.847763 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:54:31.847722 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-d76cw" 
event={"ID":"169c36ca-e161-40b0-9d76-166f4626fa3e","Type":"ContainerStarted","Data":"d5be8b6f6540dbb63ab576429313097fa1c87dde609947f07c1696e3644c022e"} Apr 16 16:54:45.894954 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:54:45.894919 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-d76cw" event={"ID":"169c36ca-e161-40b0-9d76-166f4626fa3e","Type":"ContainerStarted","Data":"33cf8b09039b4779d35cea076142e44dec05c6d158b72a36902f4c99d708002f"} Apr 16 16:54:45.912818 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:54:45.912759 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-d76cw" podStartSLOduration=2.604377124 podStartE2EDuration="16.912744169s" podCreationTimestamp="2026-04-16 16:54:29 +0000 UTC" firstStartedPulling="2026-04-16 16:54:30.940912589 +0000 UTC m=+396.081631311" lastFinishedPulling="2026-04-16 16:54:45.249279636 +0000 UTC m=+410.389998356" observedRunningTime="2026-04-16 16:54:45.911077021 +0000 UTC m=+411.051795767" watchObservedRunningTime="2026-04-16 16:54:45.912744169 +0000 UTC m=+411.053462910" Apr 16 16:55:13.568780 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:55:13.568740 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-674b59b84c-6kvzj"] Apr 16 16:55:13.593358 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:55:13.593327 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-674b59b84c-6kvzj"] Apr 16 16:55:13.593525 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:55:13.593446 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-6kvzj" Apr 16 16:55:13.596002 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:55:13.595976 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-pw59z\"" Apr 16 16:55:13.650741 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:55:13.650712 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwzsw\" (UniqueName: \"kubernetes.io/projected/6a21ae8e-27ab-451e-93d9-f02e5188a5b6-kube-api-access-nwzsw\") pod \"authorino-674b59b84c-6kvzj\" (UID: \"6a21ae8e-27ab-451e-93d9-f02e5188a5b6\") " pod="kuadrant-system/authorino-674b59b84c-6kvzj" Apr 16 16:55:13.743820 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:55:13.743794 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-79cbc94b89-64sf6"] Apr 16 16:55:13.746946 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:55:13.746931 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-64sf6" Apr 16 16:55:13.751880 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:55:13.751856 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nwzsw\" (UniqueName: \"kubernetes.io/projected/6a21ae8e-27ab-451e-93d9-f02e5188a5b6-kube-api-access-nwzsw\") pod \"authorino-674b59b84c-6kvzj\" (UID: \"6a21ae8e-27ab-451e-93d9-f02e5188a5b6\") " pod="kuadrant-system/authorino-674b59b84c-6kvzj" Apr 16 16:55:13.754108 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:55:13.754085 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-64sf6"] Apr 16 16:55:13.763561 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:55:13.763541 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwzsw\" (UniqueName: \"kubernetes.io/projected/6a21ae8e-27ab-451e-93d9-f02e5188a5b6-kube-api-access-nwzsw\") pod \"authorino-674b59b84c-6kvzj\" (UID: \"6a21ae8e-27ab-451e-93d9-f02e5188a5b6\") " pod="kuadrant-system/authorino-674b59b84c-6kvzj" Apr 16 16:55:13.853049 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:55:13.852995 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v778v\" (UniqueName: \"kubernetes.io/projected/1b6705e9-a706-494b-8d14-3474f7886a97-kube-api-access-v778v\") pod \"authorino-79cbc94b89-64sf6\" (UID: \"1b6705e9-a706-494b-8d14-3474f7886a97\") " pod="kuadrant-system/authorino-79cbc94b89-64sf6" Apr 16 16:55:13.903881 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:55:13.903858 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-6kvzj" Apr 16 16:55:13.953946 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:55:13.953916 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v778v\" (UniqueName: \"kubernetes.io/projected/1b6705e9-a706-494b-8d14-3474f7886a97-kube-api-access-v778v\") pod \"authorino-79cbc94b89-64sf6\" (UID: \"1b6705e9-a706-494b-8d14-3474f7886a97\") " pod="kuadrant-system/authorino-79cbc94b89-64sf6" Apr 16 16:55:13.963129 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:55:13.963106 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v778v\" (UniqueName: \"kubernetes.io/projected/1b6705e9-a706-494b-8d14-3474f7886a97-kube-api-access-v778v\") pod \"authorino-79cbc94b89-64sf6\" (UID: \"1b6705e9-a706-494b-8d14-3474f7886a97\") " pod="kuadrant-system/authorino-79cbc94b89-64sf6" Apr 16 16:55:14.021668 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:55:14.021639 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-674b59b84c-6kvzj"] Apr 16 16:55:14.024561 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:55:14.024530 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6a21ae8e_27ab_451e_93d9_f02e5188a5b6.slice/crio-9b795823bf3199f83ed9f4f21debda46bd5aa885a45243f1b1004fc709222532 WatchSource:0}: Error finding container 9b795823bf3199f83ed9f4f21debda46bd5aa885a45243f1b1004fc709222532: Status 404 returned error can't find the container with id 9b795823bf3199f83ed9f4f21debda46bd5aa885a45243f1b1004fc709222532 Apr 16 16:55:14.056720 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:55:14.056702 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-64sf6" Apr 16 16:55:14.183553 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:55:14.183512 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-64sf6"] Apr 16 16:55:14.186320 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:55:14.186296 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1b6705e9_a706_494b_8d14_3474f7886a97.slice/crio-f2ea98b09072d153f91426c42a923054f4972351de8c746bb3bfa375b2f2107b WatchSource:0}: Error finding container f2ea98b09072d153f91426c42a923054f4972351de8c746bb3bfa375b2f2107b: Status 404 returned error can't find the container with id f2ea98b09072d153f91426c42a923054f4972351de8c746bb3bfa375b2f2107b Apr 16 16:55:14.985806 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:55:14.985734 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-64sf6" event={"ID":"1b6705e9-a706-494b-8d14-3474f7886a97","Type":"ContainerStarted","Data":"f2ea98b09072d153f91426c42a923054f4972351de8c746bb3bfa375b2f2107b"} Apr 16 16:55:14.987547 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:55:14.987513 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-6kvzj" event={"ID":"6a21ae8e-27ab-451e-93d9-f02e5188a5b6","Type":"ContainerStarted","Data":"9b795823bf3199f83ed9f4f21debda46bd5aa885a45243f1b1004fc709222532"} Apr 16 16:55:16.995518 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:55:16.995416 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-6kvzj" event={"ID":"6a21ae8e-27ab-451e-93d9-f02e5188a5b6","Type":"ContainerStarted","Data":"6eb386213a771c5aea784fb02980a680588964f0652ba55f0481b5f75f6b23f1"} Apr 16 16:55:16.996874 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:55:16.996848 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kuadrant-system/authorino-79cbc94b89-64sf6" event={"ID":"1b6705e9-a706-494b-8d14-3474f7886a97","Type":"ContainerStarted","Data":"10d4c73a0eecebbaee84c1acf17c62f18b178e6dda33dd7297298aba8cfcbb62"} Apr 16 16:55:17.032590 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:55:17.032430 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-674b59b84c-6kvzj" podStartSLOduration=1.439433153 podStartE2EDuration="4.032413101s" podCreationTimestamp="2026-04-16 16:55:13 +0000 UTC" firstStartedPulling="2026-04-16 16:55:14.025812204 +0000 UTC m=+439.166530924" lastFinishedPulling="2026-04-16 16:55:16.618792153 +0000 UTC m=+441.759510872" observedRunningTime="2026-04-16 16:55:17.013070338 +0000 UTC m=+442.153789079" watchObservedRunningTime="2026-04-16 16:55:17.032413101 +0000 UTC m=+442.173131845" Apr 16 16:55:17.032722 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:55:17.032592 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-79cbc94b89-64sf6" podStartSLOduration=1.591523558 podStartE2EDuration="4.032584944s" podCreationTimestamp="2026-04-16 16:55:13 +0000 UTC" firstStartedPulling="2026-04-16 16:55:14.187672792 +0000 UTC m=+439.328391516" lastFinishedPulling="2026-04-16 16:55:16.628734175 +0000 UTC m=+441.769452902" observedRunningTime="2026-04-16 16:55:17.03147326 +0000 UTC m=+442.172192005" watchObservedRunningTime="2026-04-16 16:55:17.032584944 +0000 UTC m=+442.173303700" Apr 16 16:55:17.071168 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:55:17.071143 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-674b59b84c-6kvzj"] Apr 16 16:55:19.002813 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:55:19.002757 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-674b59b84c-6kvzj" podUID="6a21ae8e-27ab-451e-93d9-f02e5188a5b6" containerName="authorino" 
containerID="cri-o://6eb386213a771c5aea784fb02980a680588964f0652ba55f0481b5f75f6b23f1" gracePeriod=30 Apr 16 16:55:19.251293 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:55:19.251272 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-6kvzj" Apr 16 16:55:19.395542 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:55:19.395519 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nwzsw\" (UniqueName: \"kubernetes.io/projected/6a21ae8e-27ab-451e-93d9-f02e5188a5b6-kube-api-access-nwzsw\") pod \"6a21ae8e-27ab-451e-93d9-f02e5188a5b6\" (UID: \"6a21ae8e-27ab-451e-93d9-f02e5188a5b6\") " Apr 16 16:55:19.397675 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:55:19.397641 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a21ae8e-27ab-451e-93d9-f02e5188a5b6-kube-api-access-nwzsw" (OuterVolumeSpecName: "kube-api-access-nwzsw") pod "6a21ae8e-27ab-451e-93d9-f02e5188a5b6" (UID: "6a21ae8e-27ab-451e-93d9-f02e5188a5b6"). InnerVolumeSpecName "kube-api-access-nwzsw". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:55:19.495921 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:55:19.495893 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nwzsw\" (UniqueName: \"kubernetes.io/projected/6a21ae8e-27ab-451e-93d9-f02e5188a5b6-kube-api-access-nwzsw\") on node \"ip-10-0-138-58.ec2.internal\" DevicePath \"\"" Apr 16 16:55:20.006933 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:55:20.006898 2573 generic.go:358] "Generic (PLEG): container finished" podID="6a21ae8e-27ab-451e-93d9-f02e5188a5b6" containerID="6eb386213a771c5aea784fb02980a680588964f0652ba55f0481b5f75f6b23f1" exitCode=0 Apr 16 16:55:20.007305 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:55:20.006965 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-6kvzj" Apr 16 16:55:20.007305 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:55:20.006988 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-6kvzj" event={"ID":"6a21ae8e-27ab-451e-93d9-f02e5188a5b6","Type":"ContainerDied","Data":"6eb386213a771c5aea784fb02980a680588964f0652ba55f0481b5f75f6b23f1"} Apr 16 16:55:20.007305 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:55:20.007034 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-6kvzj" event={"ID":"6a21ae8e-27ab-451e-93d9-f02e5188a5b6","Type":"ContainerDied","Data":"9b795823bf3199f83ed9f4f21debda46bd5aa885a45243f1b1004fc709222532"} Apr 16 16:55:20.007305 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:55:20.007052 2573 scope.go:117] "RemoveContainer" containerID="6eb386213a771c5aea784fb02980a680588964f0652ba55f0481b5f75f6b23f1" Apr 16 16:55:20.016236 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:55:20.016217 2573 scope.go:117] "RemoveContainer" containerID="6eb386213a771c5aea784fb02980a680588964f0652ba55f0481b5f75f6b23f1" Apr 16 16:55:20.016554 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:55:20.016527 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6eb386213a771c5aea784fb02980a680588964f0652ba55f0481b5f75f6b23f1\": container with ID starting with 6eb386213a771c5aea784fb02980a680588964f0652ba55f0481b5f75f6b23f1 not found: ID does not exist" containerID="6eb386213a771c5aea784fb02980a680588964f0652ba55f0481b5f75f6b23f1" Apr 16 16:55:20.016670 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:55:20.016561 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6eb386213a771c5aea784fb02980a680588964f0652ba55f0481b5f75f6b23f1"} err="failed to get container status \"6eb386213a771c5aea784fb02980a680588964f0652ba55f0481b5f75f6b23f1\": rpc error: code = 
NotFound desc = could not find container \"6eb386213a771c5aea784fb02980a680588964f0652ba55f0481b5f75f6b23f1\": container with ID starting with 6eb386213a771c5aea784fb02980a680588964f0652ba55f0481b5f75f6b23f1 not found: ID does not exist" Apr 16 16:55:20.023641 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:55:20.023613 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-674b59b84c-6kvzj"] Apr 16 16:55:20.026874 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:55:20.026849 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-674b59b84c-6kvzj"] Apr 16 16:55:21.421080 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:55:21.421038 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a21ae8e-27ab-451e-93d9-f02e5188a5b6" path="/var/lib/kubelet/pods/6a21ae8e-27ab-451e-93d9-f02e5188a5b6/volumes" Apr 16 16:55:46.316913 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:55:46.316876 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-68bd676465-5ttpp"] Apr 16 16:55:46.317345 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:55:46.317176 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6a21ae8e-27ab-451e-93d9-f02e5188a5b6" containerName="authorino" Apr 16 16:55:46.317345 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:55:46.317188 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a21ae8e-27ab-451e-93d9-f02e5188a5b6" containerName="authorino" Apr 16 16:55:46.317345 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:55:46.317256 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="6a21ae8e-27ab-451e-93d9-f02e5188a5b6" containerName="authorino" Apr 16 16:55:46.320174 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:55:46.320155 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-68bd676465-5ttpp" Apr 16 16:55:46.322864 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:55:46.322833 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-server-cert\"" Apr 16 16:55:46.325733 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:55:46.325694 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-68bd676465-5ttpp"] Apr 16 16:55:46.481586 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:55:46.481557 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-725m5\" (UniqueName: \"kubernetes.io/projected/b3993f29-38ac-41b3-aed8-7010c1a5b79a-kube-api-access-725m5\") pod \"authorino-68bd676465-5ttpp\" (UID: \"b3993f29-38ac-41b3-aed8-7010c1a5b79a\") " pod="kuadrant-system/authorino-68bd676465-5ttpp" Apr 16 16:55:46.481730 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:55:46.481635 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/b3993f29-38ac-41b3-aed8-7010c1a5b79a-tls-cert\") pod \"authorino-68bd676465-5ttpp\" (UID: \"b3993f29-38ac-41b3-aed8-7010c1a5b79a\") " pod="kuadrant-system/authorino-68bd676465-5ttpp" Apr 16 16:55:46.582233 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:55:46.582169 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/b3993f29-38ac-41b3-aed8-7010c1a5b79a-tls-cert\") pod \"authorino-68bd676465-5ttpp\" (UID: \"b3993f29-38ac-41b3-aed8-7010c1a5b79a\") " pod="kuadrant-system/authorino-68bd676465-5ttpp" Apr 16 16:55:46.582233 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:55:46.582204 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-725m5\" (UniqueName: 
\"kubernetes.io/projected/b3993f29-38ac-41b3-aed8-7010c1a5b79a-kube-api-access-725m5\") pod \"authorino-68bd676465-5ttpp\" (UID: \"b3993f29-38ac-41b3-aed8-7010c1a5b79a\") " pod="kuadrant-system/authorino-68bd676465-5ttpp"
Apr 16 16:55:46.584857 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:55:46.584830 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/b3993f29-38ac-41b3-aed8-7010c1a5b79a-tls-cert\") pod \"authorino-68bd676465-5ttpp\" (UID: \"b3993f29-38ac-41b3-aed8-7010c1a5b79a\") " pod="kuadrant-system/authorino-68bd676465-5ttpp"
Apr 16 16:55:46.590272 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:55:46.590246 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-725m5\" (UniqueName: \"kubernetes.io/projected/b3993f29-38ac-41b3-aed8-7010c1a5b79a-kube-api-access-725m5\") pod \"authorino-68bd676465-5ttpp\" (UID: \"b3993f29-38ac-41b3-aed8-7010c1a5b79a\") " pod="kuadrant-system/authorino-68bd676465-5ttpp"
Apr 16 16:55:46.631174 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:55:46.631140 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-68bd676465-5ttpp"
Apr 16 16:55:46.754630 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:55:46.754602 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-68bd676465-5ttpp"]
Apr 16 16:55:46.757553 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:55:46.757522 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb3993f29_38ac_41b3_aed8_7010c1a5b79a.slice/crio-8a69c189d9fa5d917ce21be68c8ee46498940639c4810d24cb632c93d19e23de WatchSource:0}: Error finding container 8a69c189d9fa5d917ce21be68c8ee46498940639c4810d24cb632c93d19e23de: Status 404 returned error can't find the container with id 8a69c189d9fa5d917ce21be68c8ee46498940639c4810d24cb632c93d19e23de
Apr 16 16:55:47.093075 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:55:47.093046 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-68bd676465-5ttpp" event={"ID":"b3993f29-38ac-41b3-aed8-7010c1a5b79a","Type":"ContainerStarted","Data":"8a69c189d9fa5d917ce21be68c8ee46498940639c4810d24cb632c93d19e23de"}
Apr 16 16:55:48.097979 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:55:48.097940 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-68bd676465-5ttpp" event={"ID":"b3993f29-38ac-41b3-aed8-7010c1a5b79a","Type":"ContainerStarted","Data":"6e7835b394e73354db8b4402cd3b0f6712e3ca74eaf36f3d395cfcf1833a395a"}
Apr 16 16:55:48.141970 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:55:48.141908 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-68bd676465-5ttpp" podStartSLOduration=1.5242167260000001 podStartE2EDuration="2.14189285s" podCreationTimestamp="2026-04-16 16:55:46 +0000 UTC" firstStartedPulling="2026-04-16 16:55:46.758880348 +0000 UTC m=+471.899599068" lastFinishedPulling="2026-04-16 16:55:47.376556469 +0000 UTC m=+472.517275192" observedRunningTime="2026-04-16 16:55:48.113133511 +0000 UTC m=+473.253852253" watchObservedRunningTime="2026-04-16 16:55:48.14189285 +0000 UTC m=+473.282611589"
Apr 16 16:55:48.142242 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:55:48.142222 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-64sf6"]
Apr 16 16:55:48.142532 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:55:48.142508 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-79cbc94b89-64sf6" podUID="1b6705e9-a706-494b-8d14-3474f7886a97" containerName="authorino" containerID="cri-o://10d4c73a0eecebbaee84c1acf17c62f18b178e6dda33dd7297298aba8cfcbb62" gracePeriod=30
Apr 16 16:55:48.394530 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:55:48.394508 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-64sf6"
Apr 16 16:55:48.498282 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:55:48.498259 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v778v\" (UniqueName: \"kubernetes.io/projected/1b6705e9-a706-494b-8d14-3474f7886a97-kube-api-access-v778v\") pod \"1b6705e9-a706-494b-8d14-3474f7886a97\" (UID: \"1b6705e9-a706-494b-8d14-3474f7886a97\") "
Apr 16 16:55:48.500305 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:55:48.500280 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b6705e9-a706-494b-8d14-3474f7886a97-kube-api-access-v778v" (OuterVolumeSpecName: "kube-api-access-v778v") pod "1b6705e9-a706-494b-8d14-3474f7886a97" (UID: "1b6705e9-a706-494b-8d14-3474f7886a97"). InnerVolumeSpecName "kube-api-access-v778v". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 16:55:48.598797 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:55:48.598772 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-v778v\" (UniqueName: \"kubernetes.io/projected/1b6705e9-a706-494b-8d14-3474f7886a97-kube-api-access-v778v\") on node \"ip-10-0-138-58.ec2.internal\" DevicePath \"\""
Apr 16 16:55:49.102109 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:55:49.102068 2573 generic.go:358] "Generic (PLEG): container finished" podID="1b6705e9-a706-494b-8d14-3474f7886a97" containerID="10d4c73a0eecebbaee84c1acf17c62f18b178e6dda33dd7297298aba8cfcbb62" exitCode=0
Apr 16 16:55:49.102476 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:55:49.102126 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-64sf6"
Apr 16 16:55:49.102476 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:55:49.102130 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-64sf6" event={"ID":"1b6705e9-a706-494b-8d14-3474f7886a97","Type":"ContainerDied","Data":"10d4c73a0eecebbaee84c1acf17c62f18b178e6dda33dd7297298aba8cfcbb62"}
Apr 16 16:55:49.102476 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:55:49.102174 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-64sf6" event={"ID":"1b6705e9-a706-494b-8d14-3474f7886a97","Type":"ContainerDied","Data":"f2ea98b09072d153f91426c42a923054f4972351de8c746bb3bfa375b2f2107b"}
Apr 16 16:55:49.102476 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:55:49.102192 2573 scope.go:117] "RemoveContainer" containerID="10d4c73a0eecebbaee84c1acf17c62f18b178e6dda33dd7297298aba8cfcbb62"
Apr 16 16:55:49.111351 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:55:49.111332 2573 scope.go:117] "RemoveContainer" containerID="10d4c73a0eecebbaee84c1acf17c62f18b178e6dda33dd7297298aba8cfcbb62"
Apr 16 16:55:49.111794 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:55:49.111764 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10d4c73a0eecebbaee84c1acf17c62f18b178e6dda33dd7297298aba8cfcbb62\": container with ID starting with 10d4c73a0eecebbaee84c1acf17c62f18b178e6dda33dd7297298aba8cfcbb62 not found: ID does not exist" containerID="10d4c73a0eecebbaee84c1acf17c62f18b178e6dda33dd7297298aba8cfcbb62"
Apr 16 16:55:49.111897 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:55:49.111790 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10d4c73a0eecebbaee84c1acf17c62f18b178e6dda33dd7297298aba8cfcbb62"} err="failed to get container status \"10d4c73a0eecebbaee84c1acf17c62f18b178e6dda33dd7297298aba8cfcbb62\": rpc error: code = NotFound desc = could not find container \"10d4c73a0eecebbaee84c1acf17c62f18b178e6dda33dd7297298aba8cfcbb62\": container with ID starting with 10d4c73a0eecebbaee84c1acf17c62f18b178e6dda33dd7297298aba8cfcbb62 not found: ID does not exist"
Apr 16 16:55:49.123795 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:55:49.123771 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-64sf6"]
Apr 16 16:55:49.126658 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:55:49.126637 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-64sf6"]
Apr 16 16:55:49.428068 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:55:49.428000 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b6705e9-a706-494b-8d14-3474f7886a97" path="/var/lib/kubelet/pods/1b6705e9-a706-494b-8d14-3474f7886a97/volumes"
Apr 16 16:57:53.852756 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:57:53.852715 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-l24v6"]
Apr 16 16:57:53.853185 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:57:53.853151 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1b6705e9-a706-494b-8d14-3474f7886a97" containerName="authorino"
Apr 16 16:57:53.853185 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:57:53.853171 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b6705e9-a706-494b-8d14-3474f7886a97" containerName="authorino"
Apr 16 16:57:53.853273 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:57:53.853257 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="1b6705e9-a706-494b-8d14-3474f7886a97" containerName="authorino"
Apr 16 16:57:53.856356 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:57:53.856336 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-l24v6"
Apr 16 16:57:53.859186 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:57:53.859160 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"istio-ca-root-cert\""
Apr 16 16:57:53.859186 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:57:53.859187 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 16 16:57:53.859422 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:57:53.859209 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-gateway-1-openshift-default-dockercfg-mqhwd\""
Apr 16 16:57:53.859422 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:57:53.859229 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\""
Apr 16 16:57:53.864000 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:57:53.863973 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-l24v6"]
Apr 16 16:57:53.976482 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:57:53.976454 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzfw6\" (UniqueName: \"kubernetes.io/projected/60778c40-92ca-4f0b-9544-83d16c23c3a9-kube-api-access-lzfw6\") pod \"router-gateway-1-openshift-default-6c59fbf55c-l24v6\" (UID: \"60778c40-92ca-4f0b-9544-83d16c23c3a9\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-l24v6"
Apr 16 16:57:53.976585 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:57:53.976492 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/60778c40-92ca-4f0b-9544-83d16c23c3a9-istio-podinfo\") pod \"router-gateway-1-openshift-default-6c59fbf55c-l24v6\" (UID: \"60778c40-92ca-4f0b-9544-83d16c23c3a9\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-l24v6"
Apr 16 16:57:53.976585 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:57:53.976557 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/60778c40-92ca-4f0b-9544-83d16c23c3a9-workload-certs\") pod \"router-gateway-1-openshift-default-6c59fbf55c-l24v6\" (UID: \"60778c40-92ca-4f0b-9544-83d16c23c3a9\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-l24v6"
Apr 16 16:57:53.976659 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:57:53.976586 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/60778c40-92ca-4f0b-9544-83d16c23c3a9-workload-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-l24v6\" (UID: \"60778c40-92ca-4f0b-9544-83d16c23c3a9\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-l24v6"
Apr 16 16:57:53.976659 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:57:53.976609 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/60778c40-92ca-4f0b-9544-83d16c23c3a9-istio-envoy\") pod \"router-gateway-1-openshift-default-6c59fbf55c-l24v6\" (UID: \"60778c40-92ca-4f0b-9544-83d16c23c3a9\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-l24v6"
Apr 16 16:57:53.976659 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:57:53.976626 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/60778c40-92ca-4f0b-9544-83d16c23c3a9-credential-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-l24v6\" (UID: \"60778c40-92ca-4f0b-9544-83d16c23c3a9\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-l24v6"
Apr 16 16:57:53.976659 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:57:53.976641 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/60778c40-92ca-4f0b-9544-83d16c23c3a9-istio-data\") pod \"router-gateway-1-openshift-default-6c59fbf55c-l24v6\" (UID: \"60778c40-92ca-4f0b-9544-83d16c23c3a9\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-l24v6"
Apr 16 16:57:53.976803 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:57:53.976674 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/60778c40-92ca-4f0b-9544-83d16c23c3a9-istiod-ca-cert\") pod \"router-gateway-1-openshift-default-6c59fbf55c-l24v6\" (UID: \"60778c40-92ca-4f0b-9544-83d16c23c3a9\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-l24v6"
Apr 16 16:57:53.976803 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:57:53.976711 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/60778c40-92ca-4f0b-9544-83d16c23c3a9-istio-token\") pod \"router-gateway-1-openshift-default-6c59fbf55c-l24v6\" (UID: \"60778c40-92ca-4f0b-9544-83d16c23c3a9\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-l24v6"
Apr 16 16:57:54.077773 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:57:54.077746 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/60778c40-92ca-4f0b-9544-83d16c23c3a9-workload-certs\") pod \"router-gateway-1-openshift-default-6c59fbf55c-l24v6\" (UID: \"60778c40-92ca-4f0b-9544-83d16c23c3a9\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-l24v6"
Apr 16 16:57:54.077883 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:57:54.077779 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/60778c40-92ca-4f0b-9544-83d16c23c3a9-workload-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-l24v6\" (UID: \"60778c40-92ca-4f0b-9544-83d16c23c3a9\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-l24v6"
Apr 16 16:57:54.077883 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:57:54.077804 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/60778c40-92ca-4f0b-9544-83d16c23c3a9-istio-envoy\") pod \"router-gateway-1-openshift-default-6c59fbf55c-l24v6\" (UID: \"60778c40-92ca-4f0b-9544-83d16c23c3a9\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-l24v6"
Apr 16 16:57:54.077883 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:57:54.077830 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/60778c40-92ca-4f0b-9544-83d16c23c3a9-credential-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-l24v6\" (UID: \"60778c40-92ca-4f0b-9544-83d16c23c3a9\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-l24v6"
Apr 16 16:57:54.077883 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:57:54.077853 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/60778c40-92ca-4f0b-9544-83d16c23c3a9-istio-data\") pod \"router-gateway-1-openshift-default-6c59fbf55c-l24v6\" (UID: \"60778c40-92ca-4f0b-9544-83d16c23c3a9\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-l24v6"
Apr 16 16:57:54.077883 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:57:54.077876 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/60778c40-92ca-4f0b-9544-83d16c23c3a9-istiod-ca-cert\") pod \"router-gateway-1-openshift-default-6c59fbf55c-l24v6\" (UID: \"60778c40-92ca-4f0b-9544-83d16c23c3a9\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-l24v6"
Apr 16 16:57:54.078133 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:57:54.077904 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/60778c40-92ca-4f0b-9544-83d16c23c3a9-istio-token\") pod \"router-gateway-1-openshift-default-6c59fbf55c-l24v6\" (UID: \"60778c40-92ca-4f0b-9544-83d16c23c3a9\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-l24v6"
Apr 16 16:57:54.078133 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:57:54.077969 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lzfw6\" (UniqueName: \"kubernetes.io/projected/60778c40-92ca-4f0b-9544-83d16c23c3a9-kube-api-access-lzfw6\") pod \"router-gateway-1-openshift-default-6c59fbf55c-l24v6\" (UID: \"60778c40-92ca-4f0b-9544-83d16c23c3a9\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-l24v6"
Apr 16 16:57:54.078133 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:57:54.078012 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/60778c40-92ca-4f0b-9544-83d16c23c3a9-istio-podinfo\") pod \"router-gateway-1-openshift-default-6c59fbf55c-l24v6\" (UID: \"60778c40-92ca-4f0b-9544-83d16c23c3a9\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-l24v6"
Apr 16 16:57:54.078296 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:57:54.078149 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/60778c40-92ca-4f0b-9544-83d16c23c3a9-workload-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-l24v6\" (UID: \"60778c40-92ca-4f0b-9544-83d16c23c3a9\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-l24v6"
Apr 16 16:57:54.078296 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:57:54.078174 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/60778c40-92ca-4f0b-9544-83d16c23c3a9-workload-certs\") pod \"router-gateway-1-openshift-default-6c59fbf55c-l24v6\" (UID: \"60778c40-92ca-4f0b-9544-83d16c23c3a9\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-l24v6"
Apr 16 16:57:54.078369 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:57:54.078354 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/60778c40-92ca-4f0b-9544-83d16c23c3a9-credential-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-l24v6\" (UID: \"60778c40-92ca-4f0b-9544-83d16c23c3a9\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-l24v6"
Apr 16 16:57:54.078493 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:57:54.078477 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/60778c40-92ca-4f0b-9544-83d16c23c3a9-istio-data\") pod \"router-gateway-1-openshift-default-6c59fbf55c-l24v6\" (UID: \"60778c40-92ca-4f0b-9544-83d16c23c3a9\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-l24v6"
Apr 16 16:57:54.078777 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:57:54.078749 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/60778c40-92ca-4f0b-9544-83d16c23c3a9-istiod-ca-cert\") pod \"router-gateway-1-openshift-default-6c59fbf55c-l24v6\" (UID: \"60778c40-92ca-4f0b-9544-83d16c23c3a9\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-l24v6"
Apr 16 16:57:54.080121 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:57:54.080102 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/60778c40-92ca-4f0b-9544-83d16c23c3a9-istio-envoy\") pod \"router-gateway-1-openshift-default-6c59fbf55c-l24v6\" (UID: \"60778c40-92ca-4f0b-9544-83d16c23c3a9\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-l24v6"
Apr 16 16:57:54.080484 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:57:54.080464 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/60778c40-92ca-4f0b-9544-83d16c23c3a9-istio-podinfo\") pod \"router-gateway-1-openshift-default-6c59fbf55c-l24v6\" (UID: \"60778c40-92ca-4f0b-9544-83d16c23c3a9\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-l24v6"
Apr 16 16:57:54.088007 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:57:54.087981 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzfw6\" (UniqueName: \"kubernetes.io/projected/60778c40-92ca-4f0b-9544-83d16c23c3a9-kube-api-access-lzfw6\") pod \"router-gateway-1-openshift-default-6c59fbf55c-l24v6\" (UID: \"60778c40-92ca-4f0b-9544-83d16c23c3a9\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-l24v6"
Apr 16 16:57:54.088245 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:57:54.088227 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/60778c40-92ca-4f0b-9544-83d16c23c3a9-istio-token\") pod \"router-gateway-1-openshift-default-6c59fbf55c-l24v6\" (UID: \"60778c40-92ca-4f0b-9544-83d16c23c3a9\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-l24v6"
Apr 16 16:57:54.168258 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:57:54.168214 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-l24v6"
Apr 16 16:57:54.289151 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:57:54.289122 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-l24v6"]
Apr 16 16:57:54.292137 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:57:54.292110 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod60778c40_92ca_4f0b_9544_83d16c23c3a9.slice/crio-7f88c5c0341014ff246473a4b3cccd0feca9742ae0ac2a899e251d8f259a51fc WatchSource:0}: Error finding container 7f88c5c0341014ff246473a4b3cccd0feca9742ae0ac2a899e251d8f259a51fc: Status 404 returned error can't find the container with id 7f88c5c0341014ff246473a4b3cccd0feca9742ae0ac2a899e251d8f259a51fc
Apr 16 16:57:54.475648 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:57:54.475573 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-l24v6" event={"ID":"60778c40-92ca-4f0b-9544-83d16c23c3a9","Type":"ContainerStarted","Data":"7f88c5c0341014ff246473a4b3cccd0feca9742ae0ac2a899e251d8f259a51fc"}
Apr 16 16:57:55.347857 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:57:55.347829 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-jgsfb_6c99cce0-b27a-481f-8825-9d205581b7d0/console-operator/1.log"
Apr 16 16:57:55.349065 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:57:55.349042 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-jgsfb_6c99cce0-b27a-481f-8825-9d205581b7d0/console-operator/1.log"
Apr 16 16:57:56.916762 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:57:56.916718 2573 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"}
Apr 16 16:57:56.917035 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:57:56.916806 2573 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"}
Apr 16 16:57:56.917035 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:57:56.916852 2573 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"}
Apr 16 16:57:57.486198 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:57:57.486157 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-l24v6" event={"ID":"60778c40-92ca-4f0b-9544-83d16c23c3a9","Type":"ContainerStarted","Data":"bfe6b04490f62dabdb80fef408e736fc5431aa3972ab6c17126ab84994fa6b1f"}
Apr 16 16:57:57.505179 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:57:57.505131 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-l24v6" podStartSLOduration=1.8824802040000002 podStartE2EDuration="4.50511496s" podCreationTimestamp="2026-04-16 16:57:53 +0000 UTC" firstStartedPulling="2026-04-16 16:57:54.293872516 +0000 UTC m=+599.434591235" lastFinishedPulling="2026-04-16 16:57:56.916507266 +0000 UTC m=+602.057225991" observedRunningTime="2026-04-16 16:57:57.503760047 +0000 UTC m=+602.644478784" watchObservedRunningTime="2026-04-16 16:57:57.50511496 +0000 UTC m=+602.645833701"
Apr 16 16:57:58.168971 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:57:58.168935 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-l24v6"
Apr 16 16:57:58.173750 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:57:58.173725 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-l24v6"
Apr 16 16:57:58.489261 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:57:58.489200 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-l24v6"
Apr 16 16:57:58.490091 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:57:58.490066 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-l24v6"
Apr 16 16:58:20.373687 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:58:20.373653 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5b7ddfpmq2"]
Apr 16 16:58:20.379177 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:58:20.379159 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5b7ddfpmq2"
Apr 16 16:58:20.381956 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:58:20.381929 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-g4psd\""
Apr 16 16:58:20.383100 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:58:20.383076 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-configmap-ref-test-epp-sa-dockercfg-sg8s7\""
Apr 16 16:58:20.383203 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:58:20.383116 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-configmap-ref-test-kserve-self-signed-certs\""
Apr 16 16:58:20.388793 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:58:20.388773 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5b7ddfpmq2"]
Apr 16 16:58:20.456276 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:58:20.456247 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/45c90c54-e83c-4b8a-9d0e-73ca54bc3ae5-tokenizer-uds\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-5b7ddfpmq2\" (UID: \"45c90c54-e83c-4b8a-9d0e-73ca54bc3ae5\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5b7ddfpmq2"
Apr 16 16:58:20.456371 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:58:20.456295 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/45c90c54-e83c-4b8a-9d0e-73ca54bc3ae5-tokenizer-tmp\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-5b7ddfpmq2\" (UID: \"45c90c54-e83c-4b8a-9d0e-73ca54bc3ae5\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5b7ddfpmq2"
Apr 16 16:58:20.456371 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:58:20.456316 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/45c90c54-e83c-4b8a-9d0e-73ca54bc3ae5-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-5b7ddfpmq2\" (UID: \"45c90c54-e83c-4b8a-9d0e-73ca54bc3ae5\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5b7ddfpmq2"
Apr 16 16:58:20.456497 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:58:20.456403 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/45c90c54-e83c-4b8a-9d0e-73ca54bc3ae5-tokenizer-cache\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-5b7ddfpmq2\" (UID: \"45c90c54-e83c-4b8a-9d0e-73ca54bc3ae5\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5b7ddfpmq2"
Apr 16 16:58:20.456497 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:58:20.456430 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/45c90c54-e83c-4b8a-9d0e-73ca54bc3ae5-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-5b7ddfpmq2\" (UID: \"45c90c54-e83c-4b8a-9d0e-73ca54bc3ae5\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5b7ddfpmq2"
Apr 16 16:58:20.456497 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:58:20.456490 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2m59\" (UniqueName: \"kubernetes.io/projected/45c90c54-e83c-4b8a-9d0e-73ca54bc3ae5-kube-api-access-h2m59\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-5b7ddfpmq2\" (UID: \"45c90c54-e83c-4b8a-9d0e-73ca54bc3ae5\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5b7ddfpmq2"
Apr 16 16:58:20.557145 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:58:20.557118 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h2m59\" (UniqueName: \"kubernetes.io/projected/45c90c54-e83c-4b8a-9d0e-73ca54bc3ae5-kube-api-access-h2m59\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-5b7ddfpmq2\" (UID: \"45c90c54-e83c-4b8a-9d0e-73ca54bc3ae5\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5b7ddfpmq2"
Apr 16 16:58:20.557252 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:58:20.557159 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/45c90c54-e83c-4b8a-9d0e-73ca54bc3ae5-tokenizer-uds\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-5b7ddfpmq2\" (UID: \"45c90c54-e83c-4b8a-9d0e-73ca54bc3ae5\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5b7ddfpmq2"
Apr 16 16:58:20.557347 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:58:20.557328 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/45c90c54-e83c-4b8a-9d0e-73ca54bc3ae5-tokenizer-tmp\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-5b7ddfpmq2\" (UID: \"45c90c54-e83c-4b8a-9d0e-73ca54bc3ae5\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5b7ddfpmq2"
Apr 16 16:58:20.557437 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:58:20.557360 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/45c90c54-e83c-4b8a-9d0e-73ca54bc3ae5-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-5b7ddfpmq2\" (UID: \"45c90c54-e83c-4b8a-9d0e-73ca54bc3ae5\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5b7ddfpmq2"
Apr 16 16:58:20.557437 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:58:20.557406 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/45c90c54-e83c-4b8a-9d0e-73ca54bc3ae5-tokenizer-cache\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-5b7ddfpmq2\" (UID: \"45c90c54-e83c-4b8a-9d0e-73ca54bc3ae5\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5b7ddfpmq2"
Apr 16 16:58:20.557550 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:58:20.557438 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/45c90c54-e83c-4b8a-9d0e-73ca54bc3ae5-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-5b7ddfpmq2\" (UID: \"45c90c54-e83c-4b8a-9d0e-73ca54bc3ae5\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5b7ddfpmq2"
Apr 16 16:58:20.557550 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:58:20.557462 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/45c90c54-e83c-4b8a-9d0e-73ca54bc3ae5-tokenizer-uds\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-5b7ddfpmq2\" (UID: \"45c90c54-e83c-4b8a-9d0e-73ca54bc3ae5\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5b7ddfpmq2"
Apr 16 16:58:20.557678 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:58:20.557659 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/45c90c54-e83c-4b8a-9d0e-73ca54bc3ae5-tokenizer-tmp\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-5b7ddfpmq2\" (UID: \"45c90c54-e83c-4b8a-9d0e-73ca54bc3ae5\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5b7ddfpmq2"
Apr 16 16:58:20.557787 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:58:20.557767 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/45c90c54-e83c-4b8a-9d0e-73ca54bc3ae5-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-5b7ddfpmq2\" (UID: \"45c90c54-e83c-4b8a-9d0e-73ca54bc3ae5\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5b7ddfpmq2"
Apr 16 16:58:20.557841 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:58:20.557783 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/45c90c54-e83c-4b8a-9d0e-73ca54bc3ae5-tokenizer-cache\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-5b7ddfpmq2\" (UID: \"45c90c54-e83c-4b8a-9d0e-73ca54bc3ae5\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5b7ddfpmq2"
Apr 16 16:58:20.559684 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:58:20.559667 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/45c90c54-e83c-4b8a-9d0e-73ca54bc3ae5-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-5b7ddfpmq2\" (UID: \"45c90c54-e83c-4b8a-9d0e-73ca54bc3ae5\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5b7ddfpmq2"
Apr 16 16:58:20.565092 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:58:20.565067 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2m59\" (UniqueName: \"kubernetes.io/projected/45c90c54-e83c-4b8a-9d0e-73ca54bc3ae5-kube-api-access-h2m59\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-5b7ddfpmq2\" (UID: \"45c90c54-e83c-4b8a-9d0e-73ca54bc3ae5\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5b7ddfpmq2"
Apr 16 16:58:20.688649 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:58:20.688595 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5b7ddfpmq2"
Apr 16 16:58:20.812930 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:58:20.812872 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5b7ddfpmq2"]
Apr 16 16:58:20.818280 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:58:20.818248 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45c90c54_e83c_4b8a_9d0e_73ca54bc3ae5.slice/crio-c799a71e8a557e2f6669ea3b2212cd8718badebf27e4c8b99260b5e4b805222a WatchSource:0}: Error finding container c799a71e8a557e2f6669ea3b2212cd8718badebf27e4c8b99260b5e4b805222a: Status 404 returned error can't find the container with id c799a71e8a557e2f6669ea3b2212cd8718badebf27e4c8b99260b5e4b805222a
Apr 16 16:58:21.555807 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:58:21.555770 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5b7ddfpmq2" event={"ID":"45c90c54-e83c-4b8a-9d0e-73ca54bc3ae5","Type":"ContainerStarted","Data":"c799a71e8a557e2f6669ea3b2212cd8718badebf27e4c8b99260b5e4b805222a"}
Apr 16 16:58:24.566704 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:58:24.566668 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5b7ddfpmq2" event={"ID":"45c90c54-e83c-4b8a-9d0e-73ca54bc3ae5","Type":"ContainerStarted","Data":"eb90549eae563e0b5d256922611bb8b205057cc9157961673a12f2f3c0457a16"}
Apr 16 16:58:25.570919 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:58:25.570886 2573 generic.go:358] "Generic (PLEG): 
container finished" podID="45c90c54-e83c-4b8a-9d0e-73ca54bc3ae5" containerID="eb90549eae563e0b5d256922611bb8b205057cc9157961673a12f2f3c0457a16" exitCode=0 Apr 16 16:58:25.571402 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:58:25.570967 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5b7ddfpmq2" event={"ID":"45c90c54-e83c-4b8a-9d0e-73ca54bc3ae5","Type":"ContainerDied","Data":"eb90549eae563e0b5d256922611bb8b205057cc9157961673a12f2f3c0457a16"} Apr 16 16:58:27.581211 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:58:27.581175 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5b7ddfpmq2" event={"ID":"45c90c54-e83c-4b8a-9d0e-73ca54bc3ae5","Type":"ContainerStarted","Data":"3f0596e7cd71e294e2665c5b4122cda8c8bf6464d2f3ed7214479eb979f7a805"} Apr 16 16:58:55.672578 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:58:55.672542 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5b7ddfpmq2" event={"ID":"45c90c54-e83c-4b8a-9d0e-73ca54bc3ae5","Type":"ContainerStarted","Data":"c7100c7ce125c237349ed176ae0e184a8c2a577f3d957fd639c7821bea5a6630"} Apr 16 16:58:55.673054 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:58:55.672659 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5b7ddfpmq2" Apr 16 16:58:55.695036 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:58:55.694992 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5b7ddfpmq2" podStartSLOduration=1.321313103 podStartE2EDuration="35.69498045s" podCreationTimestamp="2026-04-16 16:58:20 +0000 UTC" firstStartedPulling="2026-04-16 16:58:20.820119358 +0000 UTC m=+625.960838077" 
lastFinishedPulling="2026-04-16 16:58:55.193786702 +0000 UTC m=+660.334505424" observedRunningTime="2026-04-16 16:58:55.694471723 +0000 UTC m=+660.835190465" watchObservedRunningTime="2026-04-16 16:58:55.69498045 +0000 UTC m=+660.835699192" Apr 16 16:58:56.678742 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:58:56.678717 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5b7ddfpmq2" Apr 16 16:58:58.545029 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:58:58.544990 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekzsdr"] Apr 16 16:58:58.549108 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:58:58.549082 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekzsdr" Apr 16 16:58:58.551809 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:58:58.551784 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-55f7ae4a-epp-sa-dockercfg-spgvr\"" Apr 16 16:58:58.551940 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:58:58.551861 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvdde380eaa9fe1facad32d45131f9e34d-kserve-self-signed-certs\"" Apr 16 16:58:58.559497 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:58:58.559471 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekzsdr"] Apr 16 16:58:58.691185 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:58:58.691151 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qj42g\" (UniqueName: \"kubernetes.io/projected/5f72e3ef-2fc9-467c-8bfd-540691c4edea-kube-api-access-qj42g\") pod 
\"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekzsdr\" (UID: \"5f72e3ef-2fc9-467c-8bfd-540691c4edea\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekzsdr" Apr 16 16:58:58.691185 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:58:58.691189 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/5f72e3ef-2fc9-467c-8bfd-540691c4edea-tokenizer-uds\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekzsdr\" (UID: \"5f72e3ef-2fc9-467c-8bfd-540691c4edea\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekzsdr" Apr 16 16:58:58.691491 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:58:58.691224 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5f72e3ef-2fc9-467c-8bfd-540691c4edea-tls-certs\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekzsdr\" (UID: \"5f72e3ef-2fc9-467c-8bfd-540691c4edea\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekzsdr" Apr 16 16:58:58.691491 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:58:58.691288 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5f72e3ef-2fc9-467c-8bfd-540691c4edea-kserve-provision-location\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekzsdr\" (UID: \"5f72e3ef-2fc9-467c-8bfd-540691c4edea\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekzsdr" Apr 16 16:58:58.691491 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:58:58.691371 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/5f72e3ef-2fc9-467c-8bfd-540691c4edea-tokenizer-cache\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekzsdr\" (UID: \"5f72e3ef-2fc9-467c-8bfd-540691c4edea\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekzsdr" Apr 16 16:58:58.691491 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:58:58.691431 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/5f72e3ef-2fc9-467c-8bfd-540691c4edea-tokenizer-tmp\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekzsdr\" (UID: \"5f72e3ef-2fc9-467c-8bfd-540691c4edea\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekzsdr" Apr 16 16:58:58.792529 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:58:58.792483 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/5f72e3ef-2fc9-467c-8bfd-540691c4edea-tokenizer-cache\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekzsdr\" (UID: \"5f72e3ef-2fc9-467c-8bfd-540691c4edea\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekzsdr" Apr 16 16:58:58.792529 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:58:58.792534 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/5f72e3ef-2fc9-467c-8bfd-540691c4edea-tokenizer-tmp\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekzsdr\" (UID: \"5f72e3ef-2fc9-467c-8bfd-540691c4edea\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekzsdr" Apr 16 16:58:58.792770 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:58:58.792570 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qj42g\" (UniqueName: 
\"kubernetes.io/projected/5f72e3ef-2fc9-467c-8bfd-540691c4edea-kube-api-access-qj42g\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekzsdr\" (UID: \"5f72e3ef-2fc9-467c-8bfd-540691c4edea\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekzsdr" Apr 16 16:58:58.792770 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:58:58.792598 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/5f72e3ef-2fc9-467c-8bfd-540691c4edea-tokenizer-uds\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekzsdr\" (UID: \"5f72e3ef-2fc9-467c-8bfd-540691c4edea\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekzsdr" Apr 16 16:58:58.792770 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:58:58.792626 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5f72e3ef-2fc9-467c-8bfd-540691c4edea-tls-certs\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekzsdr\" (UID: \"5f72e3ef-2fc9-467c-8bfd-540691c4edea\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekzsdr" Apr 16 16:58:58.792770 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:58:58.792662 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5f72e3ef-2fc9-467c-8bfd-540691c4edea-kserve-provision-location\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekzsdr\" (UID: \"5f72e3ef-2fc9-467c-8bfd-540691c4edea\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekzsdr" Apr 16 16:58:58.793001 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:58:58.792964 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: 
\"kubernetes.io/empty-dir/5f72e3ef-2fc9-467c-8bfd-540691c4edea-tokenizer-uds\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekzsdr\" (UID: \"5f72e3ef-2fc9-467c-8bfd-540691c4edea\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekzsdr" Apr 16 16:58:58.793063 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:58:58.793035 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/5f72e3ef-2fc9-467c-8bfd-540691c4edea-tokenizer-tmp\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekzsdr\" (UID: \"5f72e3ef-2fc9-467c-8bfd-540691c4edea\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekzsdr" Apr 16 16:58:58.793122 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:58:58.793067 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/5f72e3ef-2fc9-467c-8bfd-540691c4edea-tokenizer-cache\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekzsdr\" (UID: \"5f72e3ef-2fc9-467c-8bfd-540691c4edea\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekzsdr" Apr 16 16:58:58.793122 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:58:58.793093 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5f72e3ef-2fc9-467c-8bfd-540691c4edea-kserve-provision-location\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekzsdr\" (UID: \"5f72e3ef-2fc9-467c-8bfd-540691c4edea\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekzsdr" Apr 16 16:58:58.795164 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:58:58.795104 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/5f72e3ef-2fc9-467c-8bfd-540691c4edea-tls-certs\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekzsdr\" (UID: \"5f72e3ef-2fc9-467c-8bfd-540691c4edea\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekzsdr" Apr 16 16:58:58.808497 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:58:58.808466 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qj42g\" (UniqueName: \"kubernetes.io/projected/5f72e3ef-2fc9-467c-8bfd-540691c4edea-kube-api-access-qj42g\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekzsdr\" (UID: \"5f72e3ef-2fc9-467c-8bfd-540691c4edea\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekzsdr" Apr 16 16:58:58.860374 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:58:58.860334 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekzsdr" Apr 16 16:58:58.984585 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:58:58.984553 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekzsdr"] Apr 16 16:58:58.987328 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:58:58.987298 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f72e3ef_2fc9_467c_8bfd_540691c4edea.slice/crio-66302956134dca09c8176c99c633ec82ee2ebbf8b387590171fbda6cb224baa1 WatchSource:0}: Error finding container 66302956134dca09c8176c99c633ec82ee2ebbf8b387590171fbda6cb224baa1: Status 404 returned error can't find the container with id 66302956134dca09c8176c99c633ec82ee2ebbf8b387590171fbda6cb224baa1 Apr 16 16:58:58.989732 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:58:58.989712 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 
16:58:59.687503 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:58:59.687463 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekzsdr" event={"ID":"5f72e3ef-2fc9-467c-8bfd-540691c4edea","Type":"ContainerStarted","Data":"fe405709f9810d90f84efe335128b2c52af8f7c6191c717b11c8618bee3a29ed"} Apr 16 16:58:59.687503 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:58:59.687501 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekzsdr" event={"ID":"5f72e3ef-2fc9-467c-8bfd-540691c4edea","Type":"ContainerStarted","Data":"66302956134dca09c8176c99c633ec82ee2ebbf8b387590171fbda6cb224baa1"} Apr 16 16:59:00.688897 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:59:00.688860 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5b7ddfpmq2" Apr 16 16:59:00.688897 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:59:00.688899 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5b7ddfpmq2" Apr 16 16:59:00.689473 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:59:00.689175 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5b7ddfpmq2" podUID="45c90c54-e83c-4b8a-9d0e-73ca54bc3ae5" containerName="tokenizer" probeResult="failure" output="Get \"http://10.132.0.29:8082/healthz\": dial tcp 10.132.0.29:8082: connect: connection refused" Apr 16 16:59:00.691918 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:59:00.691897 2573 generic.go:358] "Generic (PLEG): container finished" podID="5f72e3ef-2fc9-467c-8bfd-540691c4edea" containerID="fe405709f9810d90f84efe335128b2c52af8f7c6191c717b11c8618bee3a29ed" exitCode=0 Apr 16 16:59:00.692002 ip-10-0-138-58 kubenswrapper[2573]: 
I0416 16:59:00.691976 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekzsdr" event={"ID":"5f72e3ef-2fc9-467c-8bfd-540691c4edea","Type":"ContainerDied","Data":"fe405709f9810d90f84efe335128b2c52af8f7c6191c717b11c8618bee3a29ed"} Apr 16 16:59:01.698430 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:59:01.698364 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekzsdr" event={"ID":"5f72e3ef-2fc9-467c-8bfd-540691c4edea","Type":"ContainerStarted","Data":"c00528ed9105cd2b731dc220c0f181237fefa92f6f0661f695b172ae13a36f87"} Apr 16 16:59:01.698804 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:59:01.698437 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekzsdr" event={"ID":"5f72e3ef-2fc9-467c-8bfd-540691c4edea","Type":"ContainerStarted","Data":"cd6b605a1da88f3105634e918585aaabf0f47c1c212254a00d0453f766a8a68e"} Apr 16 16:59:01.698804 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:59:01.698644 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekzsdr" Apr 16 16:59:01.720833 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:59:01.720785 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekzsdr" podStartSLOduration=3.720773351 podStartE2EDuration="3.720773351s" podCreationTimestamp="2026-04-16 16:58:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:59:01.71964491 +0000 UTC m=+666.860363651" watchObservedRunningTime="2026-04-16 16:59:01.720773351 +0000 UTC m=+666.861492093" Apr 16 16:59:08.861359 ip-10-0-138-58 
kubenswrapper[2573]: I0416 16:59:08.861327 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekzsdr" Apr 16 16:59:08.861783 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:59:08.861397 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekzsdr" Apr 16 16:59:08.864042 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:59:08.864021 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekzsdr" Apr 16 16:59:09.722939 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:59:09.722913 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekzsdr" Apr 16 16:59:10.690997 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:59:10.690963 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5b7ddfpmq2" Apr 16 16:59:10.692148 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:59:10.692131 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5b7ddfpmq2" Apr 16 16:59:27.378554 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:59:27.378516 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5b7ddfpmq2"] Apr 16 16:59:27.379148 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:59:27.378898 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5b7ddfpmq2" podUID="45c90c54-e83c-4b8a-9d0e-73ca54bc3ae5" containerName="main" 
containerID="cri-o://3f0596e7cd71e294e2665c5b4122cda8c8bf6464d2f3ed7214479eb979f7a805" gracePeriod=30 Apr 16 16:59:27.379490 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:59:27.379416 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5b7ddfpmq2" podUID="45c90c54-e83c-4b8a-9d0e-73ca54bc3ae5" containerName="tokenizer" containerID="cri-o://c7100c7ce125c237349ed176ae0e184a8c2a577f3d957fd639c7821bea5a6630" gracePeriod=30 Apr 16 16:59:27.779740 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:59:27.779651 2573 generic.go:358] "Generic (PLEG): container finished" podID="45c90c54-e83c-4b8a-9d0e-73ca54bc3ae5" containerID="3f0596e7cd71e294e2665c5b4122cda8c8bf6464d2f3ed7214479eb979f7a805" exitCode=0 Apr 16 16:59:27.779740 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:59:27.779715 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5b7ddfpmq2" event={"ID":"45c90c54-e83c-4b8a-9d0e-73ca54bc3ae5","Type":"ContainerDied","Data":"3f0596e7cd71e294e2665c5b4122cda8c8bf6464d2f3ed7214479eb979f7a805"} Apr 16 16:59:28.639567 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:59:28.639545 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5b7ddfpmq2" Apr 16 16:59:28.733082 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:59:28.732994 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/45c90c54-e83c-4b8a-9d0e-73ca54bc3ae5-tokenizer-tmp\") pod \"45c90c54-e83c-4b8a-9d0e-73ca54bc3ae5\" (UID: \"45c90c54-e83c-4b8a-9d0e-73ca54bc3ae5\") " Apr 16 16:59:28.733082 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:59:28.733035 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/45c90c54-e83c-4b8a-9d0e-73ca54bc3ae5-tokenizer-cache\") pod \"45c90c54-e83c-4b8a-9d0e-73ca54bc3ae5\" (UID: \"45c90c54-e83c-4b8a-9d0e-73ca54bc3ae5\") " Apr 16 16:59:28.733082 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:59:28.733055 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/45c90c54-e83c-4b8a-9d0e-73ca54bc3ae5-kserve-provision-location\") pod \"45c90c54-e83c-4b8a-9d0e-73ca54bc3ae5\" (UID: \"45c90c54-e83c-4b8a-9d0e-73ca54bc3ae5\") " Apr 16 16:59:28.733339 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:59:28.733097 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/45c90c54-e83c-4b8a-9d0e-73ca54bc3ae5-tokenizer-uds\") pod \"45c90c54-e83c-4b8a-9d0e-73ca54bc3ae5\" (UID: \"45c90c54-e83c-4b8a-9d0e-73ca54bc3ae5\") " Apr 16 16:59:28.733339 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:59:28.733124 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h2m59\" (UniqueName: \"kubernetes.io/projected/45c90c54-e83c-4b8a-9d0e-73ca54bc3ae5-kube-api-access-h2m59\") pod \"45c90c54-e83c-4b8a-9d0e-73ca54bc3ae5\" (UID: 
\"45c90c54-e83c-4b8a-9d0e-73ca54bc3ae5\") " Apr 16 16:59:28.733339 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:59:28.733158 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/45c90c54-e83c-4b8a-9d0e-73ca54bc3ae5-tls-certs\") pod \"45c90c54-e83c-4b8a-9d0e-73ca54bc3ae5\" (UID: \"45c90c54-e83c-4b8a-9d0e-73ca54bc3ae5\") " Apr 16 16:59:28.733528 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:59:28.733338 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45c90c54-e83c-4b8a-9d0e-73ca54bc3ae5-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "45c90c54-e83c-4b8a-9d0e-73ca54bc3ae5" (UID: "45c90c54-e83c-4b8a-9d0e-73ca54bc3ae5"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:59:28.733528 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:59:28.733349 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45c90c54-e83c-4b8a-9d0e-73ca54bc3ae5-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "45c90c54-e83c-4b8a-9d0e-73ca54bc3ae5" (UID: "45c90c54-e83c-4b8a-9d0e-73ca54bc3ae5"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:59:28.733528 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:59:28.733462 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45c90c54-e83c-4b8a-9d0e-73ca54bc3ae5-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "45c90c54-e83c-4b8a-9d0e-73ca54bc3ae5" (UID: "45c90c54-e83c-4b8a-9d0e-73ca54bc3ae5"). InnerVolumeSpecName "tokenizer-tmp". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:59:28.733823 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:59:28.733802 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45c90c54-e83c-4b8a-9d0e-73ca54bc3ae5-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "45c90c54-e83c-4b8a-9d0e-73ca54bc3ae5" (UID: "45c90c54-e83c-4b8a-9d0e-73ca54bc3ae5"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:59:28.735486 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:59:28.735448 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45c90c54-e83c-4b8a-9d0e-73ca54bc3ae5-kube-api-access-h2m59" (OuterVolumeSpecName: "kube-api-access-h2m59") pod "45c90c54-e83c-4b8a-9d0e-73ca54bc3ae5" (UID: "45c90c54-e83c-4b8a-9d0e-73ca54bc3ae5"). InnerVolumeSpecName "kube-api-access-h2m59". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:59:28.735645 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:59:28.735591 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45c90c54-e83c-4b8a-9d0e-73ca54bc3ae5-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "45c90c54-e83c-4b8a-9d0e-73ca54bc3ae5" (UID: "45c90c54-e83c-4b8a-9d0e-73ca54bc3ae5"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 16:59:28.784667 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:59:28.784627 2573 generic.go:358] "Generic (PLEG): container finished" podID="45c90c54-e83c-4b8a-9d0e-73ca54bc3ae5" containerID="c7100c7ce125c237349ed176ae0e184a8c2a577f3d957fd639c7821bea5a6630" exitCode=0
Apr 16 16:59:28.784863 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:59:28.784681 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5b7ddfpmq2" event={"ID":"45c90c54-e83c-4b8a-9d0e-73ca54bc3ae5","Type":"ContainerDied","Data":"c7100c7ce125c237349ed176ae0e184a8c2a577f3d957fd639c7821bea5a6630"}
Apr 16 16:59:28.784863 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:59:28.784705 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5b7ddfpmq2"
Apr 16 16:59:28.784863 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:59:28.784721 2573 scope.go:117] "RemoveContainer" containerID="c7100c7ce125c237349ed176ae0e184a8c2a577f3d957fd639c7821bea5a6630"
Apr 16 16:59:28.784863 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:59:28.784711 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5b7ddfpmq2" event={"ID":"45c90c54-e83c-4b8a-9d0e-73ca54bc3ae5","Type":"ContainerDied","Data":"c799a71e8a557e2f6669ea3b2212cd8718badebf27e4c8b99260b5e4b805222a"}
Apr 16 16:59:28.794433 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:59:28.794327 2573 scope.go:117] "RemoveContainer" containerID="3f0596e7cd71e294e2665c5b4122cda8c8bf6464d2f3ed7214479eb979f7a805"
Apr 16 16:59:28.802648 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:59:28.802623 2573 scope.go:117] "RemoveContainer" containerID="eb90549eae563e0b5d256922611bb8b205057cc9157961673a12f2f3c0457a16"
Apr 16 16:59:28.810420 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:59:28.810400 2573 scope.go:117] "RemoveContainer" containerID="c7100c7ce125c237349ed176ae0e184a8c2a577f3d957fd639c7821bea5a6630"
Apr 16 16:59:28.810847 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:59:28.810819 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7100c7ce125c237349ed176ae0e184a8c2a577f3d957fd639c7821bea5a6630\": container with ID starting with c7100c7ce125c237349ed176ae0e184a8c2a577f3d957fd639c7821bea5a6630 not found: ID does not exist" containerID="c7100c7ce125c237349ed176ae0e184a8c2a577f3d957fd639c7821bea5a6630"
Apr 16 16:59:28.810937 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:59:28.810855 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7100c7ce125c237349ed176ae0e184a8c2a577f3d957fd639c7821bea5a6630"} err="failed to get container status \"c7100c7ce125c237349ed176ae0e184a8c2a577f3d957fd639c7821bea5a6630\": rpc error: code = NotFound desc = could not find container \"c7100c7ce125c237349ed176ae0e184a8c2a577f3d957fd639c7821bea5a6630\": container with ID starting with c7100c7ce125c237349ed176ae0e184a8c2a577f3d957fd639c7821bea5a6630 not found: ID does not exist"
Apr 16 16:59:28.810937 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:59:28.810874 2573 scope.go:117] "RemoveContainer" containerID="3f0596e7cd71e294e2665c5b4122cda8c8bf6464d2f3ed7214479eb979f7a805"
Apr 16 16:59:28.811288 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:59:28.811259 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f0596e7cd71e294e2665c5b4122cda8c8bf6464d2f3ed7214479eb979f7a805\": container with ID starting with 3f0596e7cd71e294e2665c5b4122cda8c8bf6464d2f3ed7214479eb979f7a805 not found: ID does not exist" containerID="3f0596e7cd71e294e2665c5b4122cda8c8bf6464d2f3ed7214479eb979f7a805"
Apr 16 16:59:28.811404 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:59:28.811295 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f0596e7cd71e294e2665c5b4122cda8c8bf6464d2f3ed7214479eb979f7a805"} err="failed to get container status \"3f0596e7cd71e294e2665c5b4122cda8c8bf6464d2f3ed7214479eb979f7a805\": rpc error: code = NotFound desc = could not find container \"3f0596e7cd71e294e2665c5b4122cda8c8bf6464d2f3ed7214479eb979f7a805\": container with ID starting with 3f0596e7cd71e294e2665c5b4122cda8c8bf6464d2f3ed7214479eb979f7a805 not found: ID does not exist"
Apr 16 16:59:28.811404 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:59:28.811318 2573 scope.go:117] "RemoveContainer" containerID="eb90549eae563e0b5d256922611bb8b205057cc9157961673a12f2f3c0457a16"
Apr 16 16:59:28.811923 ip-10-0-138-58 kubenswrapper[2573]: E0416 16:59:28.811615 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb90549eae563e0b5d256922611bb8b205057cc9157961673a12f2f3c0457a16\": container with ID starting with eb90549eae563e0b5d256922611bb8b205057cc9157961673a12f2f3c0457a16 not found: ID does not exist" containerID="eb90549eae563e0b5d256922611bb8b205057cc9157961673a12f2f3c0457a16"
Apr 16 16:59:28.811923 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:59:28.811683 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb90549eae563e0b5d256922611bb8b205057cc9157961673a12f2f3c0457a16"} err="failed to get container status \"eb90549eae563e0b5d256922611bb8b205057cc9157961673a12f2f3c0457a16\": rpc error: code = NotFound desc = could not find container \"eb90549eae563e0b5d256922611bb8b205057cc9157961673a12f2f3c0457a16\": container with ID starting with eb90549eae563e0b5d256922611bb8b205057cc9157961673a12f2f3c0457a16 not found: ID does not exist"
Apr 16 16:59:28.813431 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:59:28.813412 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5b7ddfpmq2"]
Apr 16 16:59:28.819172 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:59:28.819150 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5b7ddfpmq2"]
Apr 16 16:59:28.833834 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:59:28.833809 2573 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/45c90c54-e83c-4b8a-9d0e-73ca54bc3ae5-tokenizer-uds\") on node \"ip-10-0-138-58.ec2.internal\" DevicePath \"\""
Apr 16 16:59:28.833940 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:59:28.833836 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-h2m59\" (UniqueName: \"kubernetes.io/projected/45c90c54-e83c-4b8a-9d0e-73ca54bc3ae5-kube-api-access-h2m59\") on node \"ip-10-0-138-58.ec2.internal\" DevicePath \"\""
Apr 16 16:59:28.833940 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:59:28.833852 2573 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/45c90c54-e83c-4b8a-9d0e-73ca54bc3ae5-tls-certs\") on node \"ip-10-0-138-58.ec2.internal\" DevicePath \"\""
Apr 16 16:59:28.833940 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:59:28.833867 2573 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/45c90c54-e83c-4b8a-9d0e-73ca54bc3ae5-tokenizer-tmp\") on node \"ip-10-0-138-58.ec2.internal\" DevicePath \"\""
Apr 16 16:59:28.833940 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:59:28.833880 2573 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/45c90c54-e83c-4b8a-9d0e-73ca54bc3ae5-tokenizer-cache\") on node \"ip-10-0-138-58.ec2.internal\" DevicePath \"\""
Apr 16 16:59:28.833940 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:59:28.833892 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/45c90c54-e83c-4b8a-9d0e-73ca54bc3ae5-kserve-provision-location\") on node \"ip-10-0-138-58.ec2.internal\" DevicePath \"\""
Apr 16 16:59:29.421658 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:59:29.421623 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45c90c54-e83c-4b8a-9d0e-73ca54bc3ae5" path="/var/lib/kubelet/pods/45c90c54-e83c-4b8a-9d0e-73ca54bc3ae5/volumes"
Apr 16 16:59:31.729832 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:59:31.729797 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekzsdr"
Apr 16 16:59:37.748071 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:59:37.747995 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7675b4fwgm69"]
Apr 16 16:59:37.748539 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:59:37.748299 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="45c90c54-e83c-4b8a-9d0e-73ca54bc3ae5" containerName="main"
Apr 16 16:59:37.748539 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:59:37.748310 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="45c90c54-e83c-4b8a-9d0e-73ca54bc3ae5" containerName="main"
Apr 16 16:59:37.748539 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:59:37.748321 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="45c90c54-e83c-4b8a-9d0e-73ca54bc3ae5" containerName="tokenizer"
Apr 16 16:59:37.748539 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:59:37.748327 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="45c90c54-e83c-4b8a-9d0e-73ca54bc3ae5" containerName="tokenizer"
Apr 16 16:59:37.748539 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:59:37.748340 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="45c90c54-e83c-4b8a-9d0e-73ca54bc3ae5" containerName="storage-initializer"
Apr 16 16:59:37.748539 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:59:37.748346 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="45c90c54-e83c-4b8a-9d0e-73ca54bc3ae5" containerName="storage-initializer"
Apr 16 16:59:37.748539 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:59:37.748410 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="45c90c54-e83c-4b8a-9d0e-73ca54bc3ae5" containerName="main"
Apr 16 16:59:37.748539 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:59:37.748419 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="45c90c54-e83c-4b8a-9d0e-73ca54bc3ae5" containerName="tokenizer"
Apr 16 16:59:37.750293 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:59:37.750276 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7675b4fwgm69"
Apr 16 16:59:37.753240 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:59:37.753215 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-kserve-self-signed-certs\""
Apr 16 16:59:37.754521 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:59:37.754499 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-epp-sa-dockercfg-tkh64\""
Apr 16 16:59:37.767126 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:59:37.767105 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7675b4fwgm69"]
Apr 16 16:59:37.800106 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:59:37.800083 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/baea1976-c7a6-4208-bc00-001ae053f1fa-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7675b4fwgm69\" (UID: \"baea1976-c7a6-4208-bc00-001ae053f1fa\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7675b4fwgm69"
Apr 16 16:59:37.800213 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:59:37.800109 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/baea1976-c7a6-4208-bc00-001ae053f1fa-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7675b4fwgm69\" (UID: \"baea1976-c7a6-4208-bc00-001ae053f1fa\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7675b4fwgm69"
Apr 16 16:59:37.800213 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:59:37.800131 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/baea1976-c7a6-4208-bc00-001ae053f1fa-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7675b4fwgm69\" (UID: \"baea1976-c7a6-4208-bc00-001ae053f1fa\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7675b4fwgm69"
Apr 16 16:59:37.800213 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:59:37.800157 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsdbk\" (UniqueName: \"kubernetes.io/projected/baea1976-c7a6-4208-bc00-001ae053f1fa-kube-api-access-fsdbk\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7675b4fwgm69\" (UID: \"baea1976-c7a6-4208-bc00-001ae053f1fa\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7675b4fwgm69"
Apr 16 16:59:37.800354 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:59:37.800228 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/baea1976-c7a6-4208-bc00-001ae053f1fa-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7675b4fwgm69\" (UID: \"baea1976-c7a6-4208-bc00-001ae053f1fa\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7675b4fwgm69"
Apr 16 16:59:37.800354 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:59:37.800298 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/baea1976-c7a6-4208-bc00-001ae053f1fa-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7675b4fwgm69\" (UID: \"baea1976-c7a6-4208-bc00-001ae053f1fa\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7675b4fwgm69"
Apr 16 16:59:37.901122 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:59:37.901099 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/baea1976-c7a6-4208-bc00-001ae053f1fa-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7675b4fwgm69\" (UID: \"baea1976-c7a6-4208-bc00-001ae053f1fa\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7675b4fwgm69"
Apr 16 16:59:37.901233 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:59:37.901126 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/baea1976-c7a6-4208-bc00-001ae053f1fa-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7675b4fwgm69\" (UID: \"baea1976-c7a6-4208-bc00-001ae053f1fa\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7675b4fwgm69"
Apr 16 16:59:37.901233 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:59:37.901146 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/baea1976-c7a6-4208-bc00-001ae053f1fa-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7675b4fwgm69\" (UID: \"baea1976-c7a6-4208-bc00-001ae053f1fa\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7675b4fwgm69"
Apr 16 16:59:37.901233 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:59:37.901171 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fsdbk\" (UniqueName: \"kubernetes.io/projected/baea1976-c7a6-4208-bc00-001ae053f1fa-kube-api-access-fsdbk\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7675b4fwgm69\" (UID: \"baea1976-c7a6-4208-bc00-001ae053f1fa\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7675b4fwgm69"
Apr 16 16:59:37.901233 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:59:37.901191 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/baea1976-c7a6-4208-bc00-001ae053f1fa-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7675b4fwgm69\" (UID: \"baea1976-c7a6-4208-bc00-001ae053f1fa\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7675b4fwgm69"
Apr 16 16:59:37.901489 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:59:37.901238 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/baea1976-c7a6-4208-bc00-001ae053f1fa-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7675b4fwgm69\" (UID: \"baea1976-c7a6-4208-bc00-001ae053f1fa\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7675b4fwgm69"
Apr 16 16:59:37.901608 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:59:37.901585 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/baea1976-c7a6-4208-bc00-001ae053f1fa-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7675b4fwgm69\" (UID: \"baea1976-c7a6-4208-bc00-001ae053f1fa\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7675b4fwgm69"
Apr 16 16:59:37.901665 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:59:37.901624 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/baea1976-c7a6-4208-bc00-001ae053f1fa-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7675b4fwgm69\" (UID: \"baea1976-c7a6-4208-bc00-001ae053f1fa\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7675b4fwgm69"
Apr 16 16:59:37.901702 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:59:37.901662 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/baea1976-c7a6-4208-bc00-001ae053f1fa-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7675b4fwgm69\" (UID: \"baea1976-c7a6-4208-bc00-001ae053f1fa\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7675b4fwgm69"
Apr 16 16:59:37.901702 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:59:37.901686 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/baea1976-c7a6-4208-bc00-001ae053f1fa-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7675b4fwgm69\" (UID: \"baea1976-c7a6-4208-bc00-001ae053f1fa\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7675b4fwgm69"
Apr 16 16:59:37.903910 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:59:37.903889 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/baea1976-c7a6-4208-bc00-001ae053f1fa-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7675b4fwgm69\" (UID: \"baea1976-c7a6-4208-bc00-001ae053f1fa\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7675b4fwgm69"
Apr 16 16:59:37.909741 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:59:37.909712 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsdbk\" (UniqueName: \"kubernetes.io/projected/baea1976-c7a6-4208-bc00-001ae053f1fa-kube-api-access-fsdbk\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7675b4fwgm69\" (UID: \"baea1976-c7a6-4208-bc00-001ae053f1fa\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7675b4fwgm69"
Apr 16 16:59:38.061296 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:59:38.061268 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7675b4fwgm69"
Apr 16 16:59:38.194437 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:59:38.194415 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7675b4fwgm69"]
Apr 16 16:59:38.196974 ip-10-0-138-58 kubenswrapper[2573]: W0416 16:59:38.196935 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbaea1976_c7a6_4208_bc00_001ae053f1fa.slice/crio-8196d72bfe0bfeff1b5f7d8bc43c13170890d46b8df24f59fb4b98a578566c68 WatchSource:0}: Error finding container 8196d72bfe0bfeff1b5f7d8bc43c13170890d46b8df24f59fb4b98a578566c68: Status 404 returned error can't find the container with id 8196d72bfe0bfeff1b5f7d8bc43c13170890d46b8df24f59fb4b98a578566c68
Apr 16 16:59:38.820341 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:59:38.820306 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7675b4fwgm69" event={"ID":"baea1976-c7a6-4208-bc00-001ae053f1fa","Type":"ContainerStarted","Data":"1f59c3a3268333b6db167cd88bd5125d0529c45c971efbc342dad3482fc6dbcc"}
Apr 16 16:59:38.820341 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:59:38.820343 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7675b4fwgm69" event={"ID":"baea1976-c7a6-4208-bc00-001ae053f1fa","Type":"ContainerStarted","Data":"8196d72bfe0bfeff1b5f7d8bc43c13170890d46b8df24f59fb4b98a578566c68"}
Apr 16 16:59:39.825335 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:59:39.825299 2573 generic.go:358] "Generic (PLEG): container finished" podID="baea1976-c7a6-4208-bc00-001ae053f1fa" containerID="1f59c3a3268333b6db167cd88bd5125d0529c45c971efbc342dad3482fc6dbcc" exitCode=0
Apr 16 16:59:39.825751 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:59:39.825346 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7675b4fwgm69" event={"ID":"baea1976-c7a6-4208-bc00-001ae053f1fa","Type":"ContainerDied","Data":"1f59c3a3268333b6db167cd88bd5125d0529c45c971efbc342dad3482fc6dbcc"}
Apr 16 16:59:40.834673 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:59:40.834627 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7675b4fwgm69" event={"ID":"baea1976-c7a6-4208-bc00-001ae053f1fa","Type":"ContainerStarted","Data":"34468f483ef004486bba3c118b8716239ed2395cdc17b133bf4f6a0f5ec3511e"}
Apr 16 16:59:40.834673 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:59:40.834673 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7675b4fwgm69" event={"ID":"baea1976-c7a6-4208-bc00-001ae053f1fa","Type":"ContainerStarted","Data":"7ce5f224780225b610d45d66270f1051ab10b835299165c817fceb18fabe5a2d"}
Apr 16 16:59:40.835099 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:59:40.834760 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7675b4fwgm69"
Apr 16 16:59:40.860827 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:59:40.860782 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7675b4fwgm69" podStartSLOduration=3.860768026 podStartE2EDuration="3.860768026s" podCreationTimestamp="2026-04-16 16:59:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:59:40.85843785 +0000 UTC m=+705.999156597" watchObservedRunningTime="2026-04-16 16:59:40.860768026 +0000 UTC m=+706.001486768"
Apr 16 16:59:48.061964 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:59:48.061921 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7675b4fwgm69"
Apr 16 16:59:48.061964 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:59:48.061961 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7675b4fwgm69"
Apr 16 16:59:48.064895 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:59:48.064869 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7675b4fwgm69"
Apr 16 16:59:48.861161 ip-10-0-138-58 kubenswrapper[2573]: I0416 16:59:48.861130 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7675b4fwgm69"
Apr 16 17:00:09.865243 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:09.865214 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7675b4fwgm69"
Apr 16 17:00:11.287315 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:11.287281 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7675b4fwgm69"]
Apr 16 17:00:11.287777 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:11.287598 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7675b4fwgm69" podUID="baea1976-c7a6-4208-bc00-001ae053f1fa" containerName="main" containerID="cri-o://7ce5f224780225b610d45d66270f1051ab10b835299165c817fceb18fabe5a2d" gracePeriod=30
Apr 16 17:00:11.287777 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:11.287695 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7675b4fwgm69" podUID="baea1976-c7a6-4208-bc00-001ae053f1fa" containerName="tokenizer" containerID="cri-o://34468f483ef004486bba3c118b8716239ed2395cdc17b133bf4f6a0f5ec3511e" gracePeriod=30
Apr 16 17:00:11.948552 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:11.948514 2573 generic.go:358] "Generic (PLEG): container finished" podID="baea1976-c7a6-4208-bc00-001ae053f1fa" containerID="7ce5f224780225b610d45d66270f1051ab10b835299165c817fceb18fabe5a2d" exitCode=0
Apr 16 17:00:11.948734 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:11.948585 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7675b4fwgm69" event={"ID":"baea1976-c7a6-4208-bc00-001ae053f1fa","Type":"ContainerDied","Data":"7ce5f224780225b610d45d66270f1051ab10b835299165c817fceb18fabe5a2d"}
Apr 16 17:00:12.336244 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:12.336224 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7675b4fwgm69"
Apr 16 17:00:12.456134 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:12.456084 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/baea1976-c7a6-4208-bc00-001ae053f1fa-tokenizer-cache\") pod \"baea1976-c7a6-4208-bc00-001ae053f1fa\" (UID: \"baea1976-c7a6-4208-bc00-001ae053f1fa\") "
Apr 16 17:00:12.456362 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:12.456133 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fsdbk\" (UniqueName: \"kubernetes.io/projected/baea1976-c7a6-4208-bc00-001ae053f1fa-kube-api-access-fsdbk\") pod \"baea1976-c7a6-4208-bc00-001ae053f1fa\" (UID: \"baea1976-c7a6-4208-bc00-001ae053f1fa\") "
Apr 16 17:00:12.456362 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:12.456163 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/baea1976-c7a6-4208-bc00-001ae053f1fa-tokenizer-uds\") pod \"baea1976-c7a6-4208-bc00-001ae053f1fa\" (UID: \"baea1976-c7a6-4208-bc00-001ae053f1fa\") "
Apr 16 17:00:12.456362 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:12.456205 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/baea1976-c7a6-4208-bc00-001ae053f1fa-kserve-provision-location\") pod \"baea1976-c7a6-4208-bc00-001ae053f1fa\" (UID: \"baea1976-c7a6-4208-bc00-001ae053f1fa\") "
Apr 16 17:00:12.456666 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:12.456494 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/baea1976-c7a6-4208-bc00-001ae053f1fa-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "baea1976-c7a6-4208-bc00-001ae053f1fa" (UID: "baea1976-c7a6-4208-bc00-001ae053f1fa"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 17:00:12.456666 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:12.456506 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/baea1976-c7a6-4208-bc00-001ae053f1fa-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "baea1976-c7a6-4208-bc00-001ae053f1fa" (UID: "baea1976-c7a6-4208-bc00-001ae053f1fa"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 17:00:12.456666 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:12.456603 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/baea1976-c7a6-4208-bc00-001ae053f1fa-tokenizer-tmp\") pod \"baea1976-c7a6-4208-bc00-001ae053f1fa\" (UID: \"baea1976-c7a6-4208-bc00-001ae053f1fa\") "
Apr 16 17:00:12.456864 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:12.456691 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/baea1976-c7a6-4208-bc00-001ae053f1fa-tls-certs\") pod \"baea1976-c7a6-4208-bc00-001ae053f1fa\" (UID: \"baea1976-c7a6-4208-bc00-001ae053f1fa\") "
Apr 16 17:00:12.456942 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:12.456922 2573 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/baea1976-c7a6-4208-bc00-001ae053f1fa-tokenizer-uds\") on node \"ip-10-0-138-58.ec2.internal\" DevicePath \"\""
Apr 16 17:00:12.457000 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:12.456947 2573 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/baea1976-c7a6-4208-bc00-001ae053f1fa-tokenizer-cache\") on node \"ip-10-0-138-58.ec2.internal\" DevicePath \"\""
Apr 16 17:00:12.457000 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:12.456969 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/baea1976-c7a6-4208-bc00-001ae053f1fa-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "baea1976-c7a6-4208-bc00-001ae053f1fa" (UID: "baea1976-c7a6-4208-bc00-001ae053f1fa"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 17:00:12.457126 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:12.457100 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/baea1976-c7a6-4208-bc00-001ae053f1fa-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "baea1976-c7a6-4208-bc00-001ae053f1fa" (UID: "baea1976-c7a6-4208-bc00-001ae053f1fa"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 17:00:12.458509 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:12.458488 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/baea1976-c7a6-4208-bc00-001ae053f1fa-kube-api-access-fsdbk" (OuterVolumeSpecName: "kube-api-access-fsdbk") pod "baea1976-c7a6-4208-bc00-001ae053f1fa" (UID: "baea1976-c7a6-4208-bc00-001ae053f1fa"). InnerVolumeSpecName "kube-api-access-fsdbk". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 17:00:12.458768 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:12.458718 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/baea1976-c7a6-4208-bc00-001ae053f1fa-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "baea1976-c7a6-4208-bc00-001ae053f1fa" (UID: "baea1976-c7a6-4208-bc00-001ae053f1fa"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 17:00:12.558182 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:12.558148 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fsdbk\" (UniqueName: \"kubernetes.io/projected/baea1976-c7a6-4208-bc00-001ae053f1fa-kube-api-access-fsdbk\") on node \"ip-10-0-138-58.ec2.internal\" DevicePath \"\""
Apr 16 17:00:12.558182 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:12.558176 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/baea1976-c7a6-4208-bc00-001ae053f1fa-kserve-provision-location\") on node \"ip-10-0-138-58.ec2.internal\" DevicePath \"\""
Apr 16 17:00:12.558182 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:12.558186 2573 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/baea1976-c7a6-4208-bc00-001ae053f1fa-tokenizer-tmp\") on node \"ip-10-0-138-58.ec2.internal\" DevicePath \"\""
Apr 16 17:00:12.558182 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:12.558195 2573 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/baea1976-c7a6-4208-bc00-001ae053f1fa-tls-certs\") on node \"ip-10-0-138-58.ec2.internal\" DevicePath \"\""
Apr 16 17:00:12.955092 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:12.955062 2573 generic.go:358] "Generic (PLEG): container finished" podID="baea1976-c7a6-4208-bc00-001ae053f1fa" containerID="34468f483ef004486bba3c118b8716239ed2395cdc17b133bf4f6a0f5ec3511e" exitCode=0
Apr 16 17:00:12.955220 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:12.955148 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7675b4fwgm69"
Apr 16 17:00:12.955277 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:12.955148 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7675b4fwgm69" event={"ID":"baea1976-c7a6-4208-bc00-001ae053f1fa","Type":"ContainerDied","Data":"34468f483ef004486bba3c118b8716239ed2395cdc17b133bf4f6a0f5ec3511e"}
Apr 16 17:00:12.955277 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:12.955261 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7675b4fwgm69" event={"ID":"baea1976-c7a6-4208-bc00-001ae053f1fa","Type":"ContainerDied","Data":"8196d72bfe0bfeff1b5f7d8bc43c13170890d46b8df24f59fb4b98a578566c68"}
Apr 16 17:00:12.955360 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:12.955278 2573 scope.go:117] "RemoveContainer" containerID="34468f483ef004486bba3c118b8716239ed2395cdc17b133bf4f6a0f5ec3511e"
Apr 16 17:00:12.964540 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:12.964516 2573 scope.go:117] "RemoveContainer" containerID="7ce5f224780225b610d45d66270f1051ab10b835299165c817fceb18fabe5a2d"
Apr 16 17:00:12.971539 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:12.971521 2573 scope.go:117] "RemoveContainer" containerID="1f59c3a3268333b6db167cd88bd5125d0529c45c971efbc342dad3482fc6dbcc"
Apr 16 17:00:12.977598 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:12.977575 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7675b4fwgm69"]
Apr 16 17:00:12.978980 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:12.978963 2573 scope.go:117] "RemoveContainer" containerID="34468f483ef004486bba3c118b8716239ed2395cdc17b133bf4f6a0f5ec3511e"
Apr 16 17:00:12.979394 ip-10-0-138-58 kubenswrapper[2573]: E0416 17:00:12.979348 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34468f483ef004486bba3c118b8716239ed2395cdc17b133bf4f6a0f5ec3511e\": container with ID starting with 34468f483ef004486bba3c118b8716239ed2395cdc17b133bf4f6a0f5ec3511e not found: ID does not exist" containerID="34468f483ef004486bba3c118b8716239ed2395cdc17b133bf4f6a0f5ec3511e"
Apr 16 17:00:12.979956 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:12.979411 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34468f483ef004486bba3c118b8716239ed2395cdc17b133bf4f6a0f5ec3511e"} err="failed to get container status \"34468f483ef004486bba3c118b8716239ed2395cdc17b133bf4f6a0f5ec3511e\": rpc error: code = NotFound desc = could not find container \"34468f483ef004486bba3c118b8716239ed2395cdc17b133bf4f6a0f5ec3511e\": container with ID starting with 34468f483ef004486bba3c118b8716239ed2395cdc17b133bf4f6a0f5ec3511e not found: ID does not exist"
Apr 16 17:00:12.979956 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:12.979439 2573 scope.go:117] "RemoveContainer" containerID="7ce5f224780225b610d45d66270f1051ab10b835299165c817fceb18fabe5a2d"
Apr 16 17:00:12.980083 ip-10-0-138-58 kubenswrapper[2573]: E0416 17:00:12.980015 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ce5f224780225b610d45d66270f1051ab10b835299165c817fceb18fabe5a2d\": container with ID starting with 7ce5f224780225b610d45d66270f1051ab10b835299165c817fceb18fabe5a2d not found: ID does not exist" containerID="7ce5f224780225b610d45d66270f1051ab10b835299165c817fceb18fabe5a2d"
Apr 16 17:00:12.980083 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:12.980049 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ce5f224780225b610d45d66270f1051ab10b835299165c817fceb18fabe5a2d"} err="failed to get container status 
\"7ce5f224780225b610d45d66270f1051ab10b835299165c817fceb18fabe5a2d\": rpc error: code = NotFound desc = could not find container \"7ce5f224780225b610d45d66270f1051ab10b835299165c817fceb18fabe5a2d\": container with ID starting with 7ce5f224780225b610d45d66270f1051ab10b835299165c817fceb18fabe5a2d not found: ID does not exist" Apr 16 17:00:12.980083 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:12.980067 2573 scope.go:117] "RemoveContainer" containerID="1f59c3a3268333b6db167cd88bd5125d0529c45c971efbc342dad3482fc6dbcc" Apr 16 17:00:12.980404 ip-10-0-138-58 kubenswrapper[2573]: E0416 17:00:12.980366 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f59c3a3268333b6db167cd88bd5125d0529c45c971efbc342dad3482fc6dbcc\": container with ID starting with 1f59c3a3268333b6db167cd88bd5125d0529c45c971efbc342dad3482fc6dbcc not found: ID does not exist" containerID="1f59c3a3268333b6db167cd88bd5125d0529c45c971efbc342dad3482fc6dbcc" Apr 16 17:00:12.980475 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:12.980410 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f59c3a3268333b6db167cd88bd5125d0529c45c971efbc342dad3482fc6dbcc"} err="failed to get container status \"1f59c3a3268333b6db167cd88bd5125d0529c45c971efbc342dad3482fc6dbcc\": rpc error: code = NotFound desc = could not find container \"1f59c3a3268333b6db167cd88bd5125d0529c45c971efbc342dad3482fc6dbcc\": container with ID starting with 1f59c3a3268333b6db167cd88bd5125d0529c45c971efbc342dad3482fc6dbcc not found: ID does not exist" Apr 16 17:00:12.981609 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:12.981590 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7675b4fwgm69"] Apr 16 17:00:13.421344 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:13.421309 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="baea1976-c7a6-4208-bc00-001ae053f1fa" path="/var/lib/kubelet/pods/baea1976-c7a6-4208-bc00-001ae053f1fa/volumes" Apr 16 17:00:31.845745 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:31.845709 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7db845475b-cn8h6"] Apr 16 17:00:31.846242 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:31.846053 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="baea1976-c7a6-4208-bc00-001ae053f1fa" containerName="storage-initializer" Apr 16 17:00:31.846242 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:31.846067 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="baea1976-c7a6-4208-bc00-001ae053f1fa" containerName="storage-initializer" Apr 16 17:00:31.846242 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:31.846080 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="baea1976-c7a6-4208-bc00-001ae053f1fa" containerName="main" Apr 16 17:00:31.846242 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:31.846086 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="baea1976-c7a6-4208-bc00-001ae053f1fa" containerName="main" Apr 16 17:00:31.846242 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:31.846093 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="baea1976-c7a6-4208-bc00-001ae053f1fa" containerName="tokenizer" Apr 16 17:00:31.846242 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:31.846100 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="baea1976-c7a6-4208-bc00-001ae053f1fa" containerName="tokenizer" Apr 16 17:00:31.846242 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:31.846156 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="baea1976-c7a6-4208-bc00-001ae053f1fa" containerName="main" Apr 16 17:00:31.846242 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:31.846164 2573 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="baea1976-c7a6-4208-bc00-001ae053f1fa" containerName="tokenizer" Apr 16 17:00:31.849414 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:31.849364 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7db845475b-cn8h6" Apr 16 17:00:31.852440 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:31.852418 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"precise-prefix-cache-test-kserve-self-signed-certs\"" Apr 16 17:00:31.858322 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:31.858290 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7db845475b-cn8h6"] Apr 16 17:00:32.003101 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:32.003063 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d048e566-1308-419c-8376-782218bc4dd0-home\") pod \"precise-prefix-cache-test-kserve-7db845475b-cn8h6\" (UID: \"d048e566-1308-419c-8376-782218bc4dd0\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7db845475b-cn8h6" Apr 16 17:00:32.003236 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:32.003125 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d048e566-1308-419c-8376-782218bc4dd0-model-cache\") pod \"precise-prefix-cache-test-kserve-7db845475b-cn8h6\" (UID: \"d048e566-1308-419c-8376-782218bc4dd0\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7db845475b-cn8h6" Apr 16 17:00:32.003236 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:32.003152 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d048e566-1308-419c-8376-782218bc4dd0-kserve-provision-location\") 
pod \"precise-prefix-cache-test-kserve-7db845475b-cn8h6\" (UID: \"d048e566-1308-419c-8376-782218bc4dd0\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7db845475b-cn8h6" Apr 16 17:00:32.003236 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:32.003181 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d048e566-1308-419c-8376-782218bc4dd0-dshm\") pod \"precise-prefix-cache-test-kserve-7db845475b-cn8h6\" (UID: \"d048e566-1308-419c-8376-782218bc4dd0\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7db845475b-cn8h6" Apr 16 17:00:32.003236 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:32.003198 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d048e566-1308-419c-8376-782218bc4dd0-tls-certs\") pod \"precise-prefix-cache-test-kserve-7db845475b-cn8h6\" (UID: \"d048e566-1308-419c-8376-782218bc4dd0\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7db845475b-cn8h6" Apr 16 17:00:32.003236 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:32.003213 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdkx8\" (UniqueName: \"kubernetes.io/projected/d048e566-1308-419c-8376-782218bc4dd0-kube-api-access-fdkx8\") pod \"precise-prefix-cache-test-kserve-7db845475b-cn8h6\" (UID: \"d048e566-1308-419c-8376-782218bc4dd0\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7db845475b-cn8h6" Apr 16 17:00:32.068202 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:32.068175 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-cb5747bc247r8"] Apr 16 17:00:32.070606 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:32.070587 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-cb5747bc247r8" Apr 16 17:00:32.073208 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:32.073183 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"precise-prefix-cache-test-epp-sa-dockercfg-w7pcr\"" Apr 16 17:00:32.081988 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:32.081969 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-cb5747bc247r8"] Apr 16 17:00:32.104457 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:32.104398 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d048e566-1308-419c-8376-782218bc4dd0-model-cache\") pod \"precise-prefix-cache-test-kserve-7db845475b-cn8h6\" (UID: \"d048e566-1308-419c-8376-782218bc4dd0\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7db845475b-cn8h6" Apr 16 17:00:32.104545 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:32.104460 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d048e566-1308-419c-8376-782218bc4dd0-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-7db845475b-cn8h6\" (UID: \"d048e566-1308-419c-8376-782218bc4dd0\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7db845475b-cn8h6" Apr 16 17:00:32.104545 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:32.104513 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d048e566-1308-419c-8376-782218bc4dd0-dshm\") pod \"precise-prefix-cache-test-kserve-7db845475b-cn8h6\" (UID: \"d048e566-1308-419c-8376-782218bc4dd0\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7db845475b-cn8h6" Apr 16 17:00:32.104545 ip-10-0-138-58 
kubenswrapper[2573]: I0416 17:00:32.104539 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d048e566-1308-419c-8376-782218bc4dd0-tls-certs\") pod \"precise-prefix-cache-test-kserve-7db845475b-cn8h6\" (UID: \"d048e566-1308-419c-8376-782218bc4dd0\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7db845475b-cn8h6" Apr 16 17:00:32.104659 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:32.104566 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fdkx8\" (UniqueName: \"kubernetes.io/projected/d048e566-1308-419c-8376-782218bc4dd0-kube-api-access-fdkx8\") pod \"precise-prefix-cache-test-kserve-7db845475b-cn8h6\" (UID: \"d048e566-1308-419c-8376-782218bc4dd0\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7db845475b-cn8h6" Apr 16 17:00:32.104659 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:32.104604 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d048e566-1308-419c-8376-782218bc4dd0-home\") pod \"precise-prefix-cache-test-kserve-7db845475b-cn8h6\" (UID: \"d048e566-1308-419c-8376-782218bc4dd0\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7db845475b-cn8h6" Apr 16 17:00:32.104760 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:32.104743 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d048e566-1308-419c-8376-782218bc4dd0-model-cache\") pod \"precise-prefix-cache-test-kserve-7db845475b-cn8h6\" (UID: \"d048e566-1308-419c-8376-782218bc4dd0\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7db845475b-cn8h6" Apr 16 17:00:32.104822 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:32.104800 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/d048e566-1308-419c-8376-782218bc4dd0-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-7db845475b-cn8h6\" (UID: \"d048e566-1308-419c-8376-782218bc4dd0\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7db845475b-cn8h6" Apr 16 17:00:32.104872 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:32.104840 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d048e566-1308-419c-8376-782218bc4dd0-home\") pod \"precise-prefix-cache-test-kserve-7db845475b-cn8h6\" (UID: \"d048e566-1308-419c-8376-782218bc4dd0\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7db845475b-cn8h6" Apr 16 17:00:32.106826 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:32.106808 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d048e566-1308-419c-8376-782218bc4dd0-dshm\") pod \"precise-prefix-cache-test-kserve-7db845475b-cn8h6\" (UID: \"d048e566-1308-419c-8376-782218bc4dd0\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7db845475b-cn8h6" Apr 16 17:00:32.106957 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:32.106897 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d048e566-1308-419c-8376-782218bc4dd0-tls-certs\") pod \"precise-prefix-cache-test-kserve-7db845475b-cn8h6\" (UID: \"d048e566-1308-419c-8376-782218bc4dd0\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7db845475b-cn8h6" Apr 16 17:00:32.112168 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:32.112144 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdkx8\" (UniqueName: \"kubernetes.io/projected/d048e566-1308-419c-8376-782218bc4dd0-kube-api-access-fdkx8\") pod \"precise-prefix-cache-test-kserve-7db845475b-cn8h6\" (UID: \"d048e566-1308-419c-8376-782218bc4dd0\") " 
pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7db845475b-cn8h6" Apr 16 17:00:32.162309 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:32.162290 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7db845475b-cn8h6" Apr 16 17:00:32.205371 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:32.205332 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/8c9bc31e-27c5-4d8e-9f47-46bda0a399c3-tokenizer-tmp\") pod \"precise-prefix-cache-test-kserve-router-scheduler-cb5747bc247r8\" (UID: \"8c9bc31e-27c5-4d8e-9f47-46bda0a399c3\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-cb5747bc247r8" Apr 16 17:00:32.205514 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:32.205371 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8c9bc31e-27c5-4d8e-9f47-46bda0a399c3-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-router-scheduler-cb5747bc247r8\" (UID: \"8c9bc31e-27c5-4d8e-9f47-46bda0a399c3\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-cb5747bc247r8" Apr 16 17:00:32.205514 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:32.205441 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/8c9bc31e-27c5-4d8e-9f47-46bda0a399c3-tokenizer-cache\") pod \"precise-prefix-cache-test-kserve-router-scheduler-cb5747bc247r8\" (UID: \"8c9bc31e-27c5-4d8e-9f47-46bda0a399c3\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-cb5747bc247r8" Apr 16 17:00:32.205514 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:32.205466 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rz2bf\" (UniqueName: \"kubernetes.io/projected/8c9bc31e-27c5-4d8e-9f47-46bda0a399c3-kube-api-access-rz2bf\") pod \"precise-prefix-cache-test-kserve-router-scheduler-cb5747bc247r8\" (UID: \"8c9bc31e-27c5-4d8e-9f47-46bda0a399c3\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-cb5747bc247r8" Apr 16 17:00:32.205693 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:32.205541 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8c9bc31e-27c5-4d8e-9f47-46bda0a399c3-tls-certs\") pod \"precise-prefix-cache-test-kserve-router-scheduler-cb5747bc247r8\" (UID: \"8c9bc31e-27c5-4d8e-9f47-46bda0a399c3\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-cb5747bc247r8" Apr 16 17:00:32.205693 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:32.205600 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/8c9bc31e-27c5-4d8e-9f47-46bda0a399c3-tokenizer-uds\") pod \"precise-prefix-cache-test-kserve-router-scheduler-cb5747bc247r8\" (UID: \"8c9bc31e-27c5-4d8e-9f47-46bda0a399c3\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-cb5747bc247r8" Apr 16 17:00:32.281217 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:32.281193 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7db845475b-cn8h6"] Apr 16 17:00:32.283683 ip-10-0-138-58 kubenswrapper[2573]: W0416 17:00:32.283655 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd048e566_1308_419c_8376_782218bc4dd0.slice/crio-81083e0bcc9085ac798d17013793cd8f4f1756020fb5265312492d15ef46d550 WatchSource:0}: Error finding container 
81083e0bcc9085ac798d17013793cd8f4f1756020fb5265312492d15ef46d550: Status 404 returned error can't find the container with id 81083e0bcc9085ac798d17013793cd8f4f1756020fb5265312492d15ef46d550 Apr 16 17:00:32.307295 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:32.307272 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/8c9bc31e-27c5-4d8e-9f47-46bda0a399c3-tokenizer-tmp\") pod \"precise-prefix-cache-test-kserve-router-scheduler-cb5747bc247r8\" (UID: \"8c9bc31e-27c5-4d8e-9f47-46bda0a399c3\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-cb5747bc247r8" Apr 16 17:00:32.307431 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:32.307308 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8c9bc31e-27c5-4d8e-9f47-46bda0a399c3-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-router-scheduler-cb5747bc247r8\" (UID: \"8c9bc31e-27c5-4d8e-9f47-46bda0a399c3\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-cb5747bc247r8" Apr 16 17:00:32.307431 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:32.307333 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/8c9bc31e-27c5-4d8e-9f47-46bda0a399c3-tokenizer-cache\") pod \"precise-prefix-cache-test-kserve-router-scheduler-cb5747bc247r8\" (UID: \"8c9bc31e-27c5-4d8e-9f47-46bda0a399c3\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-cb5747bc247r8" Apr 16 17:00:32.307431 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:32.307350 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rz2bf\" (UniqueName: \"kubernetes.io/projected/8c9bc31e-27c5-4d8e-9f47-46bda0a399c3-kube-api-access-rz2bf\") pod 
\"precise-prefix-cache-test-kserve-router-scheduler-cb5747bc247r8\" (UID: \"8c9bc31e-27c5-4d8e-9f47-46bda0a399c3\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-cb5747bc247r8" Apr 16 17:00:32.307431 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:32.307416 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8c9bc31e-27c5-4d8e-9f47-46bda0a399c3-tls-certs\") pod \"precise-prefix-cache-test-kserve-router-scheduler-cb5747bc247r8\" (UID: \"8c9bc31e-27c5-4d8e-9f47-46bda0a399c3\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-cb5747bc247r8" Apr 16 17:00:32.307660 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:32.307461 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/8c9bc31e-27c5-4d8e-9f47-46bda0a399c3-tokenizer-uds\") pod \"precise-prefix-cache-test-kserve-router-scheduler-cb5747bc247r8\" (UID: \"8c9bc31e-27c5-4d8e-9f47-46bda0a399c3\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-cb5747bc247r8" Apr 16 17:00:32.307755 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:32.307726 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/8c9bc31e-27c5-4d8e-9f47-46bda0a399c3-tokenizer-tmp\") pod \"precise-prefix-cache-test-kserve-router-scheduler-cb5747bc247r8\" (UID: \"8c9bc31e-27c5-4d8e-9f47-46bda0a399c3\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-cb5747bc247r8" Apr 16 17:00:32.307875 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:32.307771 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/8c9bc31e-27c5-4d8e-9f47-46bda0a399c3-tokenizer-cache\") pod \"precise-prefix-cache-test-kserve-router-scheduler-cb5747bc247r8\" (UID: 
\"8c9bc31e-27c5-4d8e-9f47-46bda0a399c3\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-cb5747bc247r8" Apr 16 17:00:32.307875 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:32.307824 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8c9bc31e-27c5-4d8e-9f47-46bda0a399c3-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-router-scheduler-cb5747bc247r8\" (UID: \"8c9bc31e-27c5-4d8e-9f47-46bda0a399c3\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-cb5747bc247r8" Apr 16 17:00:32.307875 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:32.307853 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/8c9bc31e-27c5-4d8e-9f47-46bda0a399c3-tokenizer-uds\") pod \"precise-prefix-cache-test-kserve-router-scheduler-cb5747bc247r8\" (UID: \"8c9bc31e-27c5-4d8e-9f47-46bda0a399c3\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-cb5747bc247r8" Apr 16 17:00:32.309998 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:32.309974 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8c9bc31e-27c5-4d8e-9f47-46bda0a399c3-tls-certs\") pod \"precise-prefix-cache-test-kserve-router-scheduler-cb5747bc247r8\" (UID: \"8c9bc31e-27c5-4d8e-9f47-46bda0a399c3\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-cb5747bc247r8" Apr 16 17:00:32.316049 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:32.316031 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rz2bf\" (UniqueName: \"kubernetes.io/projected/8c9bc31e-27c5-4d8e-9f47-46bda0a399c3-kube-api-access-rz2bf\") pod \"precise-prefix-cache-test-kserve-router-scheduler-cb5747bc247r8\" (UID: \"8c9bc31e-27c5-4d8e-9f47-46bda0a399c3\") " 
pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-cb5747bc247r8" Apr 16 17:00:32.379971 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:32.379917 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-cb5747bc247r8" Apr 16 17:00:32.514337 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:32.514306 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-cb5747bc247r8"] Apr 16 17:00:32.515899 ip-10-0-138-58 kubenswrapper[2573]: W0416 17:00:32.515873 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c9bc31e_27c5_4d8e_9f47_46bda0a399c3.slice/crio-9c3d4a0c5424f44cff377ede02e4c25c921660f28fd18db0c91bcab5c3e2a726 WatchSource:0}: Error finding container 9c3d4a0c5424f44cff377ede02e4c25c921660f28fd18db0c91bcab5c3e2a726: Status 404 returned error can't find the container with id 9c3d4a0c5424f44cff377ede02e4c25c921660f28fd18db0c91bcab5c3e2a726 Apr 16 17:00:33.019962 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:33.019918 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7db845475b-cn8h6" event={"ID":"d048e566-1308-419c-8376-782218bc4dd0","Type":"ContainerStarted","Data":"7ffe7e08c8182a91a6d423015792950a7dfd3cc4a8f915882f638b2b93b69c18"} Apr 16 17:00:33.020407 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:33.019968 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7db845475b-cn8h6" event={"ID":"d048e566-1308-419c-8376-782218bc4dd0","Type":"ContainerStarted","Data":"81083e0bcc9085ac798d17013793cd8f4f1756020fb5265312492d15ef46d550"} Apr 16 17:00:33.024547 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:33.024506 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-cb5747bc247r8" event={"ID":"8c9bc31e-27c5-4d8e-9f47-46bda0a399c3","Type":"ContainerStarted","Data":"fa11d0a54655823a15f3be6cec1e96ff5e29688ffcd9cee6375b82be4d7638f1"} Apr 16 17:00:33.024694 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:33.024541 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-cb5747bc247r8" event={"ID":"8c9bc31e-27c5-4d8e-9f47-46bda0a399c3","Type":"ContainerStarted","Data":"9c3d4a0c5424f44cff377ede02e4c25c921660f28fd18db0c91bcab5c3e2a726"} Apr 16 17:00:34.029480 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:34.029443 2573 generic.go:358] "Generic (PLEG): container finished" podID="8c9bc31e-27c5-4d8e-9f47-46bda0a399c3" containerID="fa11d0a54655823a15f3be6cec1e96ff5e29688ffcd9cee6375b82be4d7638f1" exitCode=0 Apr 16 17:00:34.029480 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:34.029535 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-cb5747bc247r8" event={"ID":"8c9bc31e-27c5-4d8e-9f47-46bda0a399c3","Type":"ContainerDied","Data":"fa11d0a54655823a15f3be6cec1e96ff5e29688ffcd9cee6375b82be4d7638f1"} Apr 16 17:00:35.035243 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:35.035207 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-cb5747bc247r8" event={"ID":"8c9bc31e-27c5-4d8e-9f47-46bda0a399c3","Type":"ContainerStarted","Data":"4cb160640210961bae4af79b3b909a2d2c426096f8f9c19fa08f4c91f4814339"} Apr 16 17:00:35.035243 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:35.035240 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-cb5747bc247r8" 
event={"ID":"8c9bc31e-27c5-4d8e-9f47-46bda0a399c3","Type":"ContainerStarted","Data":"8b09cf2f87bc8e024e9a386160f23df18d5af6c1487ea45f025fd9ca3fa45c2b"} Apr 16 17:00:35.035825 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:35.035263 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-cb5747bc247r8" Apr 16 17:00:35.058857 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:35.058799 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-cb5747bc247r8" podStartSLOduration=3.058778336 podStartE2EDuration="3.058778336s" podCreationTimestamp="2026-04-16 17:00:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 17:00:35.055446148 +0000 UTC m=+760.196164894" watchObservedRunningTime="2026-04-16 17:00:35.058778336 +0000 UTC m=+760.199497079" Apr 16 17:00:37.044066 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:37.044025 2573 generic.go:358] "Generic (PLEG): container finished" podID="d048e566-1308-419c-8376-782218bc4dd0" containerID="7ffe7e08c8182a91a6d423015792950a7dfd3cc4a8f915882f638b2b93b69c18" exitCode=0 Apr 16 17:00:37.044066 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:37.044060 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7db845475b-cn8h6" event={"ID":"d048e566-1308-419c-8376-782218bc4dd0","Type":"ContainerDied","Data":"7ffe7e08c8182a91a6d423015792950a7dfd3cc4a8f915882f638b2b93b69c18"} Apr 16 17:00:39.053961 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:39.053925 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7db845475b-cn8h6" 
event={"ID":"d048e566-1308-419c-8376-782218bc4dd0","Type":"ContainerStarted","Data":"6991b003f6cf0b35c90c1a16d7684ae98013ebb85fd1dcb3d991757b6e79e416"} Apr 16 17:00:39.076151 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:39.076104 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7db845475b-cn8h6" podStartSLOduration=6.931077513 podStartE2EDuration="8.076092545s" podCreationTimestamp="2026-04-16 17:00:31 +0000 UTC" firstStartedPulling="2026-04-16 17:00:37.045321453 +0000 UTC m=+762.186040177" lastFinishedPulling="2026-04-16 17:00:38.190336486 +0000 UTC m=+763.331055209" observedRunningTime="2026-04-16 17:00:39.074170772 +0000 UTC m=+764.214889514" watchObservedRunningTime="2026-04-16 17:00:39.076092545 +0000 UTC m=+764.216811287" Apr 16 17:00:42.163330 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:42.163292 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7db845475b-cn8h6" Apr 16 17:00:42.163330 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:42.163337 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7db845475b-cn8h6" Apr 16 17:00:42.176008 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:42.175981 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7db845475b-cn8h6" Apr 16 17:00:42.380734 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:42.380703 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-cb5747bc247r8" Apr 16 17:00:42.380877 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:42.380741 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-cb5747bc247r8" Apr 16 17:00:42.381801 ip-10-0-138-58 kubenswrapper[2573]: W0416 17:00:42.381779 2573 logging.go:55] [core] [Channel #104 SubChannel #105]grpc: addrConn.createTransport failed to connect to {Addr: "10.132.0.33:9003", ServerName: "10.132.0.33:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.132.0.33:9003: connect: connection refused" Apr 16 17:00:42.383164 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:42.383142 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-cb5747bc247r8" Apr 16 17:00:43.068913 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:43.068878 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-cb5747bc247r8" Apr 16 17:00:43.079126 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:43.079099 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7db845475b-cn8h6" Apr 16 17:00:43.381828 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:43.381737 2573 prober.go:120] "Probe failed" probeType="Liveness" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-cb5747bc247r8" podUID="8c9bc31e-27c5-4d8e-9f47-46bda0a399c3" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.132.0.33:9003\" within 1s: context deadline exceeded" Apr 16 17:00:52.380654 ip-10-0-138-58 kubenswrapper[2573]: W0416 17:00:52.380625 2573 logging.go:55] [core] [Channel #112 SubChannel #113]grpc: addrConn.createTransport failed to connect to {Addr: "10.132.0.33:9003", ServerName: "10.132.0.33:9003", }. 
Err: connection error: desc = "transport: Error while dialing: dial tcp 10.132.0.33:9003: connect: connection refused" Apr 16 17:00:53.381447 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:00:53.381397 2573 prober.go:120] "Probe failed" probeType="Liveness" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-cb5747bc247r8" podUID="8c9bc31e-27c5-4d8e-9f47-46bda0a399c3" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.132.0.33:9003\" within 1s: context deadline exceeded" Apr 16 17:00:53.381942 ip-10-0-138-58 kubenswrapper[2573]: W0416 17:00:53.381492 2573 logging.go:55] [core] [Channel #112 SubChannel #113]grpc: addrConn.createTransport failed to connect to {Addr: "10.132.0.33:9003", ServerName: "10.132.0.33:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.132.0.33:9003: connect: connection refused" Apr 16 17:01:04.071846 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:04.071773 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-cb5747bc247r8" Apr 16 17:01:05.291442 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:05.291408 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7db845475b-cn8h6"] Apr 16 17:01:05.294735 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:05.294714 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-cb5747bc247r8"] Apr 16 17:01:05.295581 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:05.295182 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7db845475b-cn8h6" podUID="d048e566-1308-419c-8376-782218bc4dd0" containerName="main" containerID="cri-o://6991b003f6cf0b35c90c1a16d7684ae98013ebb85fd1dcb3d991757b6e79e416" gracePeriod=30 Apr 16 
17:01:05.296336 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:05.295865 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-cb5747bc247r8" podUID="8c9bc31e-27c5-4d8e-9f47-46bda0a399c3" containerName="main" containerID="cri-o://8b09cf2f87bc8e024e9a386160f23df18d5af6c1487ea45f025fd9ca3fa45c2b" gracePeriod=30 Apr 16 17:01:05.300858 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:05.295982 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-cb5747bc247r8" podUID="8c9bc31e-27c5-4d8e-9f47-46bda0a399c3" containerName="tokenizer" containerID="cri-o://4cb160640210961bae4af79b3b909a2d2c426096f8f9c19fa08f4c91f4814339" gracePeriod=30 Apr 16 17:01:05.548828 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:05.548769 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7db845475b-cn8h6" Apr 16 17:01:05.692325 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:05.692291 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d048e566-1308-419c-8376-782218bc4dd0-home\") pod \"d048e566-1308-419c-8376-782218bc4dd0\" (UID: \"d048e566-1308-419c-8376-782218bc4dd0\") " Apr 16 17:01:05.692325 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:05.692329 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d048e566-1308-419c-8376-782218bc4dd0-model-cache\") pod \"d048e566-1308-419c-8376-782218bc4dd0\" (UID: \"d048e566-1308-419c-8376-782218bc4dd0\") " Apr 16 17:01:05.692583 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:05.692355 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/d048e566-1308-419c-8376-782218bc4dd0-kserve-provision-location\") pod \"d048e566-1308-419c-8376-782218bc4dd0\" (UID: \"d048e566-1308-419c-8376-782218bc4dd0\") " Apr 16 17:01:05.692583 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:05.692405 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d048e566-1308-419c-8376-782218bc4dd0-dshm\") pod \"d048e566-1308-419c-8376-782218bc4dd0\" (UID: \"d048e566-1308-419c-8376-782218bc4dd0\") " Apr 16 17:01:05.692583 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:05.692472 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d048e566-1308-419c-8376-782218bc4dd0-tls-certs\") pod \"d048e566-1308-419c-8376-782218bc4dd0\" (UID: \"d048e566-1308-419c-8376-782218bc4dd0\") " Apr 16 17:01:05.692583 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:05.692538 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fdkx8\" (UniqueName: \"kubernetes.io/projected/d048e566-1308-419c-8376-782218bc4dd0-kube-api-access-fdkx8\") pod \"d048e566-1308-419c-8376-782218bc4dd0\" (UID: \"d048e566-1308-419c-8376-782218bc4dd0\") " Apr 16 17:01:05.692843 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:05.692588 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d048e566-1308-419c-8376-782218bc4dd0-home" (OuterVolumeSpecName: "home") pod "d048e566-1308-419c-8376-782218bc4dd0" (UID: "d048e566-1308-419c-8376-782218bc4dd0"). InnerVolumeSpecName "home". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:01:05.692843 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:05.692636 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d048e566-1308-419c-8376-782218bc4dd0-model-cache" (OuterVolumeSpecName: "model-cache") pod "d048e566-1308-419c-8376-782218bc4dd0" (UID: "d048e566-1308-419c-8376-782218bc4dd0"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:01:05.692843 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:05.692823 2573 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d048e566-1308-419c-8376-782218bc4dd0-home\") on node \"ip-10-0-138-58.ec2.internal\" DevicePath \"\"" Apr 16 17:01:05.692843 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:05.692841 2573 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d048e566-1308-419c-8376-782218bc4dd0-model-cache\") on node \"ip-10-0-138-58.ec2.internal\" DevicePath \"\"" Apr 16 17:01:05.694622 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:05.694591 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d048e566-1308-419c-8376-782218bc4dd0-kube-api-access-fdkx8" (OuterVolumeSpecName: "kube-api-access-fdkx8") pod "d048e566-1308-419c-8376-782218bc4dd0" (UID: "d048e566-1308-419c-8376-782218bc4dd0"). InnerVolumeSpecName "kube-api-access-fdkx8". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 17:01:05.694622 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:05.694610 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d048e566-1308-419c-8376-782218bc4dd0-dshm" (OuterVolumeSpecName: "dshm") pod "d048e566-1308-419c-8376-782218bc4dd0" (UID: "d048e566-1308-419c-8376-782218bc4dd0"). InnerVolumeSpecName "dshm". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:01:05.694896 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:05.694868 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d048e566-1308-419c-8376-782218bc4dd0-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "d048e566-1308-419c-8376-782218bc4dd0" (UID: "d048e566-1308-419c-8376-782218bc4dd0"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 17:01:05.747210 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:05.747165 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d048e566-1308-419c-8376-782218bc4dd0-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "d048e566-1308-419c-8376-782218bc4dd0" (UID: "d048e566-1308-419c-8376-782218bc4dd0"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:01:05.794214 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:05.794177 2573 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d048e566-1308-419c-8376-782218bc4dd0-tls-certs\") on node \"ip-10-0-138-58.ec2.internal\" DevicePath \"\"" Apr 16 17:01:05.794214 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:05.794212 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fdkx8\" (UniqueName: \"kubernetes.io/projected/d048e566-1308-419c-8376-782218bc4dd0-kube-api-access-fdkx8\") on node \"ip-10-0-138-58.ec2.internal\" DevicePath \"\"" Apr 16 17:01:05.794214 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:05.794223 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d048e566-1308-419c-8376-782218bc4dd0-kserve-provision-location\") on node \"ip-10-0-138-58.ec2.internal\" DevicePath \"\"" Apr 16 17:01:05.794450 ip-10-0-138-58 kubenswrapper[2573]: 
I0416 17:01:05.794232 2573 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d048e566-1308-419c-8376-782218bc4dd0-dshm\") on node \"ip-10-0-138-58.ec2.internal\" DevicePath \"\"" Apr 16 17:01:06.148527 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:06.148488 2573 generic.go:358] "Generic (PLEG): container finished" podID="8c9bc31e-27c5-4d8e-9f47-46bda0a399c3" containerID="8b09cf2f87bc8e024e9a386160f23df18d5af6c1487ea45f025fd9ca3fa45c2b" exitCode=0 Apr 16 17:01:06.148527 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:06.148522 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-cb5747bc247r8" event={"ID":"8c9bc31e-27c5-4d8e-9f47-46bda0a399c3","Type":"ContainerDied","Data":"8b09cf2f87bc8e024e9a386160f23df18d5af6c1487ea45f025fd9ca3fa45c2b"} Apr 16 17:01:06.149932 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:06.149910 2573 generic.go:358] "Generic (PLEG): container finished" podID="d048e566-1308-419c-8376-782218bc4dd0" containerID="6991b003f6cf0b35c90c1a16d7684ae98013ebb85fd1dcb3d991757b6e79e416" exitCode=0 Apr 16 17:01:06.150039 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:06.149943 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7db845475b-cn8h6" event={"ID":"d048e566-1308-419c-8376-782218bc4dd0","Type":"ContainerDied","Data":"6991b003f6cf0b35c90c1a16d7684ae98013ebb85fd1dcb3d991757b6e79e416"} Apr 16 17:01:06.150039 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:06.149963 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7db845475b-cn8h6" event={"ID":"d048e566-1308-419c-8376-782218bc4dd0","Type":"ContainerDied","Data":"81083e0bcc9085ac798d17013793cd8f4f1756020fb5265312492d15ef46d550"} Apr 16 17:01:06.150039 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:06.149979 2573 scope.go:117] "RemoveContainer" 
containerID="6991b003f6cf0b35c90c1a16d7684ae98013ebb85fd1dcb3d991757b6e79e416" Apr 16 17:01:06.150039 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:06.150002 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7db845475b-cn8h6" Apr 16 17:01:06.158028 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:06.158006 2573 scope.go:117] "RemoveContainer" containerID="7ffe7e08c8182a91a6d423015792950a7dfd3cc4a8f915882f638b2b93b69c18" Apr 16 17:01:06.168997 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:06.168980 2573 scope.go:117] "RemoveContainer" containerID="6991b003f6cf0b35c90c1a16d7684ae98013ebb85fd1dcb3d991757b6e79e416" Apr 16 17:01:06.169233 ip-10-0-138-58 kubenswrapper[2573]: E0416 17:01:06.169213 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6991b003f6cf0b35c90c1a16d7684ae98013ebb85fd1dcb3d991757b6e79e416\": container with ID starting with 6991b003f6cf0b35c90c1a16d7684ae98013ebb85fd1dcb3d991757b6e79e416 not found: ID does not exist" containerID="6991b003f6cf0b35c90c1a16d7684ae98013ebb85fd1dcb3d991757b6e79e416" Apr 16 17:01:06.169288 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:06.169241 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6991b003f6cf0b35c90c1a16d7684ae98013ebb85fd1dcb3d991757b6e79e416"} err="failed to get container status \"6991b003f6cf0b35c90c1a16d7684ae98013ebb85fd1dcb3d991757b6e79e416\": rpc error: code = NotFound desc = could not find container \"6991b003f6cf0b35c90c1a16d7684ae98013ebb85fd1dcb3d991757b6e79e416\": container with ID starting with 6991b003f6cf0b35c90c1a16d7684ae98013ebb85fd1dcb3d991757b6e79e416 not found: ID does not exist" Apr 16 17:01:06.169288 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:06.169259 2573 scope.go:117] "RemoveContainer" 
containerID="7ffe7e08c8182a91a6d423015792950a7dfd3cc4a8f915882f638b2b93b69c18" Apr 16 17:01:06.169513 ip-10-0-138-58 kubenswrapper[2573]: E0416 17:01:06.169496 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ffe7e08c8182a91a6d423015792950a7dfd3cc4a8f915882f638b2b93b69c18\": container with ID starting with 7ffe7e08c8182a91a6d423015792950a7dfd3cc4a8f915882f638b2b93b69c18 not found: ID does not exist" containerID="7ffe7e08c8182a91a6d423015792950a7dfd3cc4a8f915882f638b2b93b69c18" Apr 16 17:01:06.169578 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:06.169519 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ffe7e08c8182a91a6d423015792950a7dfd3cc4a8f915882f638b2b93b69c18"} err="failed to get container status \"7ffe7e08c8182a91a6d423015792950a7dfd3cc4a8f915882f638b2b93b69c18\": rpc error: code = NotFound desc = could not find container \"7ffe7e08c8182a91a6d423015792950a7dfd3cc4a8f915882f638b2b93b69c18\": container with ID starting with 7ffe7e08c8182a91a6d423015792950a7dfd3cc4a8f915882f638b2b93b69c18 not found: ID does not exist" Apr 16 17:01:06.173693 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:06.173610 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7db845475b-cn8h6"] Apr 16 17:01:06.175339 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:06.175321 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7db845475b-cn8h6"] Apr 16 17:01:06.736572 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:06.736553 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-cb5747bc247r8" Apr 16 17:01:06.801770 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:06.801743 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/8c9bc31e-27c5-4d8e-9f47-46bda0a399c3-tokenizer-tmp\") pod \"8c9bc31e-27c5-4d8e-9f47-46bda0a399c3\" (UID: \"8c9bc31e-27c5-4d8e-9f47-46bda0a399c3\") " Apr 16 17:01:06.801770 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:06.801774 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/8c9bc31e-27c5-4d8e-9f47-46bda0a399c3-tokenizer-uds\") pod \"8c9bc31e-27c5-4d8e-9f47-46bda0a399c3\" (UID: \"8c9bc31e-27c5-4d8e-9f47-46bda0a399c3\") " Apr 16 17:01:06.801994 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:06.801826 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8c9bc31e-27c5-4d8e-9f47-46bda0a399c3-tls-certs\") pod \"8c9bc31e-27c5-4d8e-9f47-46bda0a399c3\" (UID: \"8c9bc31e-27c5-4d8e-9f47-46bda0a399c3\") " Apr 16 17:01:06.801994 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:06.801846 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/8c9bc31e-27c5-4d8e-9f47-46bda0a399c3-tokenizer-cache\") pod \"8c9bc31e-27c5-4d8e-9f47-46bda0a399c3\" (UID: \"8c9bc31e-27c5-4d8e-9f47-46bda0a399c3\") " Apr 16 17:01:06.801994 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:06.801869 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rz2bf\" (UniqueName: \"kubernetes.io/projected/8c9bc31e-27c5-4d8e-9f47-46bda0a399c3-kube-api-access-rz2bf\") pod \"8c9bc31e-27c5-4d8e-9f47-46bda0a399c3\" (UID: \"8c9bc31e-27c5-4d8e-9f47-46bda0a399c3\") " Apr 16 17:01:06.801994 
ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:06.801894 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8c9bc31e-27c5-4d8e-9f47-46bda0a399c3-kserve-provision-location\") pod \"8c9bc31e-27c5-4d8e-9f47-46bda0a399c3\" (UID: \"8c9bc31e-27c5-4d8e-9f47-46bda0a399c3\") " Apr 16 17:01:06.802214 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:06.802176 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c9bc31e-27c5-4d8e-9f47-46bda0a399c3-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "8c9bc31e-27c5-4d8e-9f47-46bda0a399c3" (UID: "8c9bc31e-27c5-4d8e-9f47-46bda0a399c3"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:01:06.802272 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:06.802183 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c9bc31e-27c5-4d8e-9f47-46bda0a399c3-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "8c9bc31e-27c5-4d8e-9f47-46bda0a399c3" (UID: "8c9bc31e-27c5-4d8e-9f47-46bda0a399c3"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:01:06.802272 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:06.802201 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c9bc31e-27c5-4d8e-9f47-46bda0a399c3-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "8c9bc31e-27c5-4d8e-9f47-46bda0a399c3" (UID: "8c9bc31e-27c5-4d8e-9f47-46bda0a399c3"). InnerVolumeSpecName "tokenizer-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:01:06.802701 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:06.802679 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c9bc31e-27c5-4d8e-9f47-46bda0a399c3-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "8c9bc31e-27c5-4d8e-9f47-46bda0a399c3" (UID: "8c9bc31e-27c5-4d8e-9f47-46bda0a399c3"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:01:06.804534 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:06.804511 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c9bc31e-27c5-4d8e-9f47-46bda0a399c3-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "8c9bc31e-27c5-4d8e-9f47-46bda0a399c3" (UID: "8c9bc31e-27c5-4d8e-9f47-46bda0a399c3"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 17:01:06.804534 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:06.804520 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c9bc31e-27c5-4d8e-9f47-46bda0a399c3-kube-api-access-rz2bf" (OuterVolumeSpecName: "kube-api-access-rz2bf") pod "8c9bc31e-27c5-4d8e-9f47-46bda0a399c3" (UID: "8c9bc31e-27c5-4d8e-9f47-46bda0a399c3"). InnerVolumeSpecName "kube-api-access-rz2bf". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 17:01:06.903292 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:06.903227 2573 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/8c9bc31e-27c5-4d8e-9f47-46bda0a399c3-tokenizer-cache\") on node \"ip-10-0-138-58.ec2.internal\" DevicePath \"\"" Apr 16 17:01:06.903292 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:06.903249 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rz2bf\" (UniqueName: \"kubernetes.io/projected/8c9bc31e-27c5-4d8e-9f47-46bda0a399c3-kube-api-access-rz2bf\") on node \"ip-10-0-138-58.ec2.internal\" DevicePath \"\"" Apr 16 17:01:06.903292 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:06.903260 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8c9bc31e-27c5-4d8e-9f47-46bda0a399c3-kserve-provision-location\") on node \"ip-10-0-138-58.ec2.internal\" DevicePath \"\"" Apr 16 17:01:06.903292 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:06.903269 2573 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/8c9bc31e-27c5-4d8e-9f47-46bda0a399c3-tokenizer-tmp\") on node \"ip-10-0-138-58.ec2.internal\" DevicePath \"\"" Apr 16 17:01:06.903292 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:06.903278 2573 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/8c9bc31e-27c5-4d8e-9f47-46bda0a399c3-tokenizer-uds\") on node \"ip-10-0-138-58.ec2.internal\" DevicePath \"\"" Apr 16 17:01:06.903292 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:06.903286 2573 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8c9bc31e-27c5-4d8e-9f47-46bda0a399c3-tls-certs\") on node \"ip-10-0-138-58.ec2.internal\" DevicePath \"\"" Apr 16 17:01:07.155793 ip-10-0-138-58 kubenswrapper[2573]: 
I0416 17:01:07.155708 2573 generic.go:358] "Generic (PLEG): container finished" podID="8c9bc31e-27c5-4d8e-9f47-46bda0a399c3" containerID="4cb160640210961bae4af79b3b909a2d2c426096f8f9c19fa08f4c91f4814339" exitCode=0 Apr 16 17:01:07.155793 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:07.155751 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-cb5747bc247r8" event={"ID":"8c9bc31e-27c5-4d8e-9f47-46bda0a399c3","Type":"ContainerDied","Data":"4cb160640210961bae4af79b3b909a2d2c426096f8f9c19fa08f4c91f4814339"} Apr 16 17:01:07.155793 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:07.155772 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-cb5747bc247r8" event={"ID":"8c9bc31e-27c5-4d8e-9f47-46bda0a399c3","Type":"ContainerDied","Data":"9c3d4a0c5424f44cff377ede02e4c25c921660f28fd18db0c91bcab5c3e2a726"} Apr 16 17:01:07.155793 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:07.155790 2573 scope.go:117] "RemoveContainer" containerID="4cb160640210961bae4af79b3b909a2d2c426096f8f9c19fa08f4c91f4814339" Apr 16 17:01:07.156050 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:07.155791 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-cb5747bc247r8" Apr 16 17:01:07.164987 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:07.164964 2573 scope.go:117] "RemoveContainer" containerID="8b09cf2f87bc8e024e9a386160f23df18d5af6c1487ea45f025fd9ca3fa45c2b" Apr 16 17:01:07.171984 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:07.171967 2573 scope.go:117] "RemoveContainer" containerID="fa11d0a54655823a15f3be6cec1e96ff5e29688ffcd9cee6375b82be4d7638f1" Apr 16 17:01:07.179348 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:07.179324 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-cb5747bc247r8"] Apr 16 17:01:07.180025 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:07.180009 2573 scope.go:117] "RemoveContainer" containerID="4cb160640210961bae4af79b3b909a2d2c426096f8f9c19fa08f4c91f4814339" Apr 16 17:01:07.180309 ip-10-0-138-58 kubenswrapper[2573]: E0416 17:01:07.180288 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4cb160640210961bae4af79b3b909a2d2c426096f8f9c19fa08f4c91f4814339\": container with ID starting with 4cb160640210961bae4af79b3b909a2d2c426096f8f9c19fa08f4c91f4814339 not found: ID does not exist" containerID="4cb160640210961bae4af79b3b909a2d2c426096f8f9c19fa08f4c91f4814339" Apr 16 17:01:07.180412 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:07.180320 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4cb160640210961bae4af79b3b909a2d2c426096f8f9c19fa08f4c91f4814339"} err="failed to get container status \"4cb160640210961bae4af79b3b909a2d2c426096f8f9c19fa08f4c91f4814339\": rpc error: code = NotFound desc = could not find container \"4cb160640210961bae4af79b3b909a2d2c426096f8f9c19fa08f4c91f4814339\": container with ID starting with 4cb160640210961bae4af79b3b909a2d2c426096f8f9c19fa08f4c91f4814339 not 
found: ID does not exist" Apr 16 17:01:07.180412 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:07.180347 2573 scope.go:117] "RemoveContainer" containerID="8b09cf2f87bc8e024e9a386160f23df18d5af6c1487ea45f025fd9ca3fa45c2b" Apr 16 17:01:07.180805 ip-10-0-138-58 kubenswrapper[2573]: E0416 17:01:07.180785 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b09cf2f87bc8e024e9a386160f23df18d5af6c1487ea45f025fd9ca3fa45c2b\": container with ID starting with 8b09cf2f87bc8e024e9a386160f23df18d5af6c1487ea45f025fd9ca3fa45c2b not found: ID does not exist" containerID="8b09cf2f87bc8e024e9a386160f23df18d5af6c1487ea45f025fd9ca3fa45c2b" Apr 16 17:01:07.180895 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:07.180809 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b09cf2f87bc8e024e9a386160f23df18d5af6c1487ea45f025fd9ca3fa45c2b"} err="failed to get container status \"8b09cf2f87bc8e024e9a386160f23df18d5af6c1487ea45f025fd9ca3fa45c2b\": rpc error: code = NotFound desc = could not find container \"8b09cf2f87bc8e024e9a386160f23df18d5af6c1487ea45f025fd9ca3fa45c2b\": container with ID starting with 8b09cf2f87bc8e024e9a386160f23df18d5af6c1487ea45f025fd9ca3fa45c2b not found: ID does not exist" Apr 16 17:01:07.180895 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:07.180826 2573 scope.go:117] "RemoveContainer" containerID="fa11d0a54655823a15f3be6cec1e96ff5e29688ffcd9cee6375b82be4d7638f1" Apr 16 17:01:07.181076 ip-10-0-138-58 kubenswrapper[2573]: E0416 17:01:07.181057 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa11d0a54655823a15f3be6cec1e96ff5e29688ffcd9cee6375b82be4d7638f1\": container with ID starting with fa11d0a54655823a15f3be6cec1e96ff5e29688ffcd9cee6375b82be4d7638f1 not found: ID does not exist" containerID="fa11d0a54655823a15f3be6cec1e96ff5e29688ffcd9cee6375b82be4d7638f1" 
Apr 16 17:01:07.181119 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:07.181081 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa11d0a54655823a15f3be6cec1e96ff5e29688ffcd9cee6375b82be4d7638f1"} err="failed to get container status \"fa11d0a54655823a15f3be6cec1e96ff5e29688ffcd9cee6375b82be4d7638f1\": rpc error: code = NotFound desc = could not find container \"fa11d0a54655823a15f3be6cec1e96ff5e29688ffcd9cee6375b82be4d7638f1\": container with ID starting with fa11d0a54655823a15f3be6cec1e96ff5e29688ffcd9cee6375b82be4d7638f1 not found: ID does not exist"
Apr 16 17:01:07.185225 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:07.185205 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-cb5747bc247r8"]
Apr 16 17:01:07.426710 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:07.426603 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c9bc31e-27c5-4d8e-9f47-46bda0a399c3" path="/var/lib/kubelet/pods/8c9bc31e-27c5-4d8e-9f47-46bda0a399c3/volumes"
Apr 16 17:01:07.427438 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:07.427423 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d048e566-1308-419c-8376-782218bc4dd0" path="/var/lib/kubelet/pods/d048e566-1308-419c-8376-782218bc4dd0/volumes"
Apr 16 17:01:40.936083 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:40.936051 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7db4d9c6b5-zg9vl"]
Apr 16 17:01:40.936530 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:40.936368 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8c9bc31e-27c5-4d8e-9f47-46bda0a399c3" containerName="tokenizer"
Apr 16 17:01:40.936530 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:40.936404 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c9bc31e-27c5-4d8e-9f47-46bda0a399c3" containerName="tokenizer"
Apr 16 17:01:40.936530 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:40.936430 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8c9bc31e-27c5-4d8e-9f47-46bda0a399c3" containerName="main"
Apr 16 17:01:40.936530 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:40.936436 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c9bc31e-27c5-4d8e-9f47-46bda0a399c3" containerName="main"
Apr 16 17:01:40.936530 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:40.936446 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d048e566-1308-419c-8376-782218bc4dd0" containerName="storage-initializer"
Apr 16 17:01:40.936530 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:40.936452 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="d048e566-1308-419c-8376-782218bc4dd0" containerName="storage-initializer"
Apr 16 17:01:40.936530 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:40.936457 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8c9bc31e-27c5-4d8e-9f47-46bda0a399c3" containerName="storage-initializer"
Apr 16 17:01:40.936530 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:40.936463 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c9bc31e-27c5-4d8e-9f47-46bda0a399c3" containerName="storage-initializer"
Apr 16 17:01:40.936530 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:40.936469 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d048e566-1308-419c-8376-782218bc4dd0" containerName="main"
Apr 16 17:01:40.936530 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:40.936474 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="d048e566-1308-419c-8376-782218bc4dd0" containerName="main"
Apr 16 17:01:40.936530 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:40.936534 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="d048e566-1308-419c-8376-782218bc4dd0" containerName="main"
Apr 16 17:01:40.936891 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:40.936545 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="8c9bc31e-27c5-4d8e-9f47-46bda0a399c3" containerName="main"
Apr 16 17:01:40.936891 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:40.936551 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="8c9bc31e-27c5-4d8e-9f47-46bda0a399c3" containerName="tokenizer"
Apr 16 17:01:40.939310 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:40.939294 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7db4d9c6b5-zg9vl"
Apr 16 17:01:40.942352 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:40.942119 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-kserve-self-signed-certs\""
Apr 16 17:01:40.942352 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:40.942161 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-epp-sa-dockercfg-lkwff\""
Apr 16 17:01:40.948059 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:40.948019 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7db4d9c6b5-zg9vl"]
Apr 16 17:01:40.961849 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:40.961827 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/8d23b6ec-8709-4840-aa58-3f21e0ced27e-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-7db4d9c6b5-zg9vl\" (UID: \"8d23b6ec-8709-4840-aa58-3f21e0ced27e\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7db4d9c6b5-zg9vl"
Apr 16 17:01:40.961977 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:40.961882 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8d23b6ec-8709-4840-aa58-3f21e0ced27e-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-7db4d9c6b5-zg9vl\" (UID: \"8d23b6ec-8709-4840-aa58-3f21e0ced27e\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7db4d9c6b5-zg9vl"
Apr 16 17:01:40.961977 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:40.961902 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/8d23b6ec-8709-4840-aa58-3f21e0ced27e-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-7db4d9c6b5-zg9vl\" (UID: \"8d23b6ec-8709-4840-aa58-3f21e0ced27e\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7db4d9c6b5-zg9vl"
Apr 16 17:01:40.961977 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:40.961927 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8d23b6ec-8709-4840-aa58-3f21e0ced27e-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-7db4d9c6b5-zg9vl\" (UID: \"8d23b6ec-8709-4840-aa58-3f21e0ced27e\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7db4d9c6b5-zg9vl"
Apr 16 17:01:40.962085 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:40.962021 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/8d23b6ec-8709-4840-aa58-3f21e0ced27e-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-7db4d9c6b5-zg9vl\" (UID: \"8d23b6ec-8709-4840-aa58-3f21e0ced27e\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7db4d9c6b5-zg9vl"
Apr 16 17:01:40.962085 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:40.962055 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlg54\" (UniqueName: \"kubernetes.io/projected/8d23b6ec-8709-4840-aa58-3f21e0ced27e-kube-api-access-wlg54\") pod \"stop-feature-test-kserve-router-scheduler-7db4d9c6b5-zg9vl\" (UID: \"8d23b6ec-8709-4840-aa58-3f21e0ced27e\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7db4d9c6b5-zg9vl"
Apr 16 17:01:41.062685 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:41.062662 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8d23b6ec-8709-4840-aa58-3f21e0ced27e-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-7db4d9c6b5-zg9vl\" (UID: \"8d23b6ec-8709-4840-aa58-3f21e0ced27e\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7db4d9c6b5-zg9vl"
Apr 16 17:01:41.062793 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:41.062712 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/8d23b6ec-8709-4840-aa58-3f21e0ced27e-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-7db4d9c6b5-zg9vl\" (UID: \"8d23b6ec-8709-4840-aa58-3f21e0ced27e\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7db4d9c6b5-zg9vl"
Apr 16 17:01:41.062793 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:41.062742 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wlg54\" (UniqueName: \"kubernetes.io/projected/8d23b6ec-8709-4840-aa58-3f21e0ced27e-kube-api-access-wlg54\") pod \"stop-feature-test-kserve-router-scheduler-7db4d9c6b5-zg9vl\" (UID: \"8d23b6ec-8709-4840-aa58-3f21e0ced27e\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7db4d9c6b5-zg9vl"
Apr 16 17:01:41.062793 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:41.062782 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/8d23b6ec-8709-4840-aa58-3f21e0ced27e-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-7db4d9c6b5-zg9vl\" (UID: \"8d23b6ec-8709-4840-aa58-3f21e0ced27e\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7db4d9c6b5-zg9vl"
Apr 16 17:01:41.062926 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:41.062844 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8d23b6ec-8709-4840-aa58-3f21e0ced27e-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-7db4d9c6b5-zg9vl\" (UID: \"8d23b6ec-8709-4840-aa58-3f21e0ced27e\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7db4d9c6b5-zg9vl"
Apr 16 17:01:41.062926 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:41.062871 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/8d23b6ec-8709-4840-aa58-3f21e0ced27e-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-7db4d9c6b5-zg9vl\" (UID: \"8d23b6ec-8709-4840-aa58-3f21e0ced27e\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7db4d9c6b5-zg9vl"
Apr 16 17:01:41.063111 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:41.063091 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8d23b6ec-8709-4840-aa58-3f21e0ced27e-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-7db4d9c6b5-zg9vl\" (UID: \"8d23b6ec-8709-4840-aa58-3f21e0ced27e\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7db4d9c6b5-zg9vl"
Apr 16 17:01:41.063179 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:41.063134 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/8d23b6ec-8709-4840-aa58-3f21e0ced27e-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-7db4d9c6b5-zg9vl\" (UID: \"8d23b6ec-8709-4840-aa58-3f21e0ced27e\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7db4d9c6b5-zg9vl"
Apr 16 17:01:41.063179 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:41.063164 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/8d23b6ec-8709-4840-aa58-3f21e0ced27e-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-7db4d9c6b5-zg9vl\" (UID: \"8d23b6ec-8709-4840-aa58-3f21e0ced27e\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7db4d9c6b5-zg9vl"
Apr 16 17:01:41.063275 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:41.063219 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/8d23b6ec-8709-4840-aa58-3f21e0ced27e-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-7db4d9c6b5-zg9vl\" (UID: \"8d23b6ec-8709-4840-aa58-3f21e0ced27e\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7db4d9c6b5-zg9vl"
Apr 16 17:01:41.065169 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:41.065148 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8d23b6ec-8709-4840-aa58-3f21e0ced27e-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-7db4d9c6b5-zg9vl\" (UID: \"8d23b6ec-8709-4840-aa58-3f21e0ced27e\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7db4d9c6b5-zg9vl"
Apr 16 17:01:41.072002 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:41.071981 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlg54\" (UniqueName: \"kubernetes.io/projected/8d23b6ec-8709-4840-aa58-3f21e0ced27e-kube-api-access-wlg54\") pod \"stop-feature-test-kserve-router-scheduler-7db4d9c6b5-zg9vl\" (UID: \"8d23b6ec-8709-4840-aa58-3f21e0ced27e\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7db4d9c6b5-zg9vl"
Apr 16 17:01:41.250360 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:41.250291 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7db4d9c6b5-zg9vl"
Apr 16 17:01:41.405112 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:41.405083 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7db4d9c6b5-zg9vl"]
Apr 16 17:01:41.412801 ip-10-0-138-58 kubenswrapper[2573]: W0416 17:01:41.412770 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d23b6ec_8709_4840_aa58_3f21e0ced27e.slice/crio-9b62d44beba5282fd3890c943525ea31c8b052b388b1601abd8268103d38adf6 WatchSource:0}: Error finding container 9b62d44beba5282fd3890c943525ea31c8b052b388b1601abd8268103d38adf6: Status 404 returned error can't find the container with id 9b62d44beba5282fd3890c943525ea31c8b052b388b1601abd8268103d38adf6
Apr 16 17:01:42.275170 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:42.275138 2573 generic.go:358] "Generic (PLEG): container finished" podID="8d23b6ec-8709-4840-aa58-3f21e0ced27e" containerID="57bb8c2ae8a14566712560f20c517ac59735695a0f18950bd2824436c4e9c836" exitCode=0
Apr 16 17:01:42.275522 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:42.275178 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7db4d9c6b5-zg9vl" event={"ID":"8d23b6ec-8709-4840-aa58-3f21e0ced27e","Type":"ContainerDied","Data":"57bb8c2ae8a14566712560f20c517ac59735695a0f18950bd2824436c4e9c836"}
Apr 16 17:01:42.275522 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:42.275202 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7db4d9c6b5-zg9vl" event={"ID":"8d23b6ec-8709-4840-aa58-3f21e0ced27e","Type":"ContainerStarted","Data":"9b62d44beba5282fd3890c943525ea31c8b052b388b1601abd8268103d38adf6"}
Apr 16 17:01:43.280540 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:43.280505 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7db4d9c6b5-zg9vl" event={"ID":"8d23b6ec-8709-4840-aa58-3f21e0ced27e","Type":"ContainerStarted","Data":"0c1938812c1a5d88292cede90db5c095410098baed9a06dc3dcff4073fbfef0c"}
Apr 16 17:01:43.280540 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:43.280540 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7db4d9c6b5-zg9vl" event={"ID":"8d23b6ec-8709-4840-aa58-3f21e0ced27e","Type":"ContainerStarted","Data":"ba72fb95defc0b79154b441825c3bf7315349fdbf3f7173d455460c35e3d6a59"}
Apr 16 17:01:43.281059 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:43.280682 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7db4d9c6b5-zg9vl"
Apr 16 17:01:43.300883 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:43.300647 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7db4d9c6b5-zg9vl" podStartSLOduration=3.300631212 podStartE2EDuration="3.300631212s" podCreationTimestamp="2026-04-16 17:01:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 17:01:43.300209591 +0000 UTC m=+828.440928336" watchObservedRunningTime="2026-04-16 17:01:43.300631212 +0000 UTC m=+828.441349956"
Apr 16 17:01:51.251325 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:51.251286 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7db4d9c6b5-zg9vl"
Apr 16 17:01:51.251325 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:51.251330 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7db4d9c6b5-zg9vl"
Apr 16 17:01:51.254171 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:51.254147 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7db4d9c6b5-zg9vl"
Apr 16 17:01:51.309138 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:51.309110 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7db4d9c6b5-zg9vl"
Apr 16 17:01:52.294743 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:52.294713 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekzsdr"]
Apr 16 17:01:52.295232 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:52.295168 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekzsdr" podUID="5f72e3ef-2fc9-467c-8bfd-540691c4edea" containerName="tokenizer" containerID="cri-o://c00528ed9105cd2b731dc220c0f181237fefa92f6f0661f695b172ae13a36f87" gracePeriod=30
Apr 16 17:01:52.295951 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:52.295825 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekzsdr" podUID="5f72e3ef-2fc9-467c-8bfd-540691c4edea" containerName="main" containerID="cri-o://cd6b605a1da88f3105634e918585aaabf0f47c1c212254a00d0453f766a8a68e" gracePeriod=30
Apr 16 17:01:53.316838 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:53.316809 2573 generic.go:358] "Generic (PLEG): container finished" podID="5f72e3ef-2fc9-467c-8bfd-540691c4edea" containerID="c00528ed9105cd2b731dc220c0f181237fefa92f6f0661f695b172ae13a36f87" exitCode=0
Apr 16 17:01:53.316838 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:53.316837 2573 generic.go:358] "Generic (PLEG): container finished" podID="5f72e3ef-2fc9-467c-8bfd-540691c4edea" containerID="cd6b605a1da88f3105634e918585aaabf0f47c1c212254a00d0453f766a8a68e" exitCode=0
Apr 16 17:01:53.317192 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:53.316864 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekzsdr" event={"ID":"5f72e3ef-2fc9-467c-8bfd-540691c4edea","Type":"ContainerDied","Data":"c00528ed9105cd2b731dc220c0f181237fefa92f6f0661f695b172ae13a36f87"}
Apr 16 17:01:53.317192 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:53.316892 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekzsdr" event={"ID":"5f72e3ef-2fc9-467c-8bfd-540691c4edea","Type":"ContainerDied","Data":"cd6b605a1da88f3105634e918585aaabf0f47c1c212254a00d0453f766a8a68e"}
Apr 16 17:01:53.343852 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:53.343828 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekzsdr"
Apr 16 17:01:53.462889 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:53.462832 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qj42g\" (UniqueName: \"kubernetes.io/projected/5f72e3ef-2fc9-467c-8bfd-540691c4edea-kube-api-access-qj42g\") pod \"5f72e3ef-2fc9-467c-8bfd-540691c4edea\" (UID: \"5f72e3ef-2fc9-467c-8bfd-540691c4edea\") "
Apr 16 17:01:53.462889 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:53.462878 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5f72e3ef-2fc9-467c-8bfd-540691c4edea-kserve-provision-location\") pod \"5f72e3ef-2fc9-467c-8bfd-540691c4edea\" (UID: \"5f72e3ef-2fc9-467c-8bfd-540691c4edea\") "
Apr 16 17:01:53.463074 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:53.462897 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5f72e3ef-2fc9-467c-8bfd-540691c4edea-tls-certs\") pod \"5f72e3ef-2fc9-467c-8bfd-540691c4edea\" (UID: \"5f72e3ef-2fc9-467c-8bfd-540691c4edea\") "
Apr 16 17:01:53.463074 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:53.462935 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/5f72e3ef-2fc9-467c-8bfd-540691c4edea-tokenizer-cache\") pod \"5f72e3ef-2fc9-467c-8bfd-540691c4edea\" (UID: \"5f72e3ef-2fc9-467c-8bfd-540691c4edea\") "
Apr 16 17:01:53.463074 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:53.462953 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/5f72e3ef-2fc9-467c-8bfd-540691c4edea-tokenizer-uds\") pod \"5f72e3ef-2fc9-467c-8bfd-540691c4edea\" (UID: \"5f72e3ef-2fc9-467c-8bfd-540691c4edea\") "
Apr 16 17:01:53.463074 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:53.462969 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/5f72e3ef-2fc9-467c-8bfd-540691c4edea-tokenizer-tmp\") pod \"5f72e3ef-2fc9-467c-8bfd-540691c4edea\" (UID: \"5f72e3ef-2fc9-467c-8bfd-540691c4edea\") "
Apr 16 17:01:53.463284 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:53.463176 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f72e3ef-2fc9-467c-8bfd-540691c4edea-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "5f72e3ef-2fc9-467c-8bfd-540691c4edea" (UID: "5f72e3ef-2fc9-467c-8bfd-540691c4edea"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 17:01:53.463284 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:53.463195 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f72e3ef-2fc9-467c-8bfd-540691c4edea-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "5f72e3ef-2fc9-467c-8bfd-540691c4edea" (UID: "5f72e3ef-2fc9-467c-8bfd-540691c4edea"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 17:01:53.463439 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:53.463349 2573 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/5f72e3ef-2fc9-467c-8bfd-540691c4edea-tokenizer-cache\") on node \"ip-10-0-138-58.ec2.internal\" DevicePath \"\""
Apr 16 17:01:53.463439 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:53.463369 2573 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/5f72e3ef-2fc9-467c-8bfd-540691c4edea-tokenizer-uds\") on node \"ip-10-0-138-58.ec2.internal\" DevicePath \"\""
Apr 16 17:01:53.463439 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:53.463369 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f72e3ef-2fc9-467c-8bfd-540691c4edea-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "5f72e3ef-2fc9-467c-8bfd-540691c4edea" (UID: "5f72e3ef-2fc9-467c-8bfd-540691c4edea"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 17:01:53.463677 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:53.463655 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f72e3ef-2fc9-467c-8bfd-540691c4edea-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "5f72e3ef-2fc9-467c-8bfd-540691c4edea" (UID: "5f72e3ef-2fc9-467c-8bfd-540691c4edea"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 17:01:53.464993 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:53.464971 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f72e3ef-2fc9-467c-8bfd-540691c4edea-kube-api-access-qj42g" (OuterVolumeSpecName: "kube-api-access-qj42g") pod "5f72e3ef-2fc9-467c-8bfd-540691c4edea" (UID: "5f72e3ef-2fc9-467c-8bfd-540691c4edea"). InnerVolumeSpecName "kube-api-access-qj42g". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 17:01:53.465197 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:53.465178 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f72e3ef-2fc9-467c-8bfd-540691c4edea-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "5f72e3ef-2fc9-467c-8bfd-540691c4edea" (UID: "5f72e3ef-2fc9-467c-8bfd-540691c4edea"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 17:01:53.564260 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:53.564239 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5f72e3ef-2fc9-467c-8bfd-540691c4edea-kserve-provision-location\") on node \"ip-10-0-138-58.ec2.internal\" DevicePath \"\""
Apr 16 17:01:53.564260 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:53.564260 2573 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5f72e3ef-2fc9-467c-8bfd-540691c4edea-tls-certs\") on node \"ip-10-0-138-58.ec2.internal\" DevicePath \"\""
Apr 16 17:01:53.564401 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:53.564271 2573 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/5f72e3ef-2fc9-467c-8bfd-540691c4edea-tokenizer-tmp\") on node \"ip-10-0-138-58.ec2.internal\" DevicePath \"\""
Apr 16 17:01:53.564401 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:53.564279 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qj42g\" (UniqueName: \"kubernetes.io/projected/5f72e3ef-2fc9-467c-8bfd-540691c4edea-kube-api-access-qj42g\") on node \"ip-10-0-138-58.ec2.internal\" DevicePath \"\""
Apr 16 17:01:54.321392 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:54.321353 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekzsdr"
Apr 16 17:01:54.321797 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:54.321353 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekzsdr" event={"ID":"5f72e3ef-2fc9-467c-8bfd-540691c4edea","Type":"ContainerDied","Data":"66302956134dca09c8176c99c633ec82ee2ebbf8b387590171fbda6cb224baa1"}
Apr 16 17:01:54.321797 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:54.321434 2573 scope.go:117] "RemoveContainer" containerID="c00528ed9105cd2b731dc220c0f181237fefa92f6f0661f695b172ae13a36f87"
Apr 16 17:01:54.330625 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:54.330470 2573 scope.go:117] "RemoveContainer" containerID="cd6b605a1da88f3105634e918585aaabf0f47c1c212254a00d0453f766a8a68e"
Apr 16 17:01:54.337292 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:54.337272 2573 scope.go:117] "RemoveContainer" containerID="fe405709f9810d90f84efe335128b2c52af8f7c6191c717b11c8618bee3a29ed"
Apr 16 17:01:54.343262 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:54.343239 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekzsdr"]
Apr 16 17:01:54.347892 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:54.347875 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekzsdr"]
Apr 16 17:01:55.420659 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:01:55.420629 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f72e3ef-2fc9-467c-8bfd-540691c4edea" path="/var/lib/kubelet/pods/5f72e3ef-2fc9-467c-8bfd-540691c4edea/volumes"
Apr 16 17:02:00.305404 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:02:00.305344 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-975d74c9dwv6x"]
Apr 16 17:02:00.305795 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:02:00.305678 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5f72e3ef-2fc9-467c-8bfd-540691c4edea" containerName="tokenizer"
Apr 16 17:02:00.305795 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:02:00.305691 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f72e3ef-2fc9-467c-8bfd-540691c4edea" containerName="tokenizer"
Apr 16 17:02:00.305795 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:02:00.305721 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5f72e3ef-2fc9-467c-8bfd-540691c4edea" containerName="storage-initializer"
Apr 16 17:02:00.305795 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:02:00.305727 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f72e3ef-2fc9-467c-8bfd-540691c4edea" containerName="storage-initializer"
Apr 16 17:02:00.305795 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:02:00.305737 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5f72e3ef-2fc9-467c-8bfd-540691c4edea" containerName="main"
Apr 16 17:02:00.305795 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:02:00.305742 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f72e3ef-2fc9-467c-8bfd-540691c4edea" containerName="main"
Apr 16 17:02:00.305795 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:02:00.305797 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="5f72e3ef-2fc9-467c-8bfd-540691c4edea" containerName="main"
Apr 16 17:02:00.306051 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:02:00.305809 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="5f72e3ef-2fc9-467c-8bfd-540691c4edea" containerName="tokenizer"
Apr 16 17:02:00.310774 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:02:00.310748 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-975d74c9dwv6x"
Apr 16 17:02:00.313583 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:02:00.313548 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-test-kserve-self-signed-certs\""
Apr 16 17:02:00.313703 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:02:00.313649 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-test-epp-sa-dockercfg-2rxpm\""
Apr 16 17:02:00.320793 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:02:00.319690 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-975d74c9dwv6x"]
Apr 16 17:02:00.414995 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:02:00.414969 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6957063a-60e4-4d44-9b7e-97a59c1a183d-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-router-scheduler-975d74c9dwv6x\" (UID: \"6957063a-60e4-4d44-9b7e-97a59c1a183d\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-975d74c9dwv6x"
Apr 16 17:02:00.415109 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:02:00.415007 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkxsp\" (UniqueName: \"kubernetes.io/projected/6957063a-60e4-4d44-9b7e-97a59c1a183d-kube-api-access-zkxsp\") pod \"custom-route-timeout-test-kserve-router-scheduler-975d74c9dwv6x\" (UID: \"6957063a-60e4-4d44-9b7e-97a59c1a183d\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-975d74c9dwv6x"
Apr 16 17:02:00.415109 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:02:00.415029 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/6957063a-60e4-4d44-9b7e-97a59c1a183d-tokenizer-uds\") pod \"custom-route-timeout-test-kserve-router-scheduler-975d74c9dwv6x\" (UID: \"6957063a-60e4-4d44-9b7e-97a59c1a183d\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-975d74c9dwv6x"
Apr 16 17:02:00.415109 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:02:00.415084 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6957063a-60e4-4d44-9b7e-97a59c1a183d-tls-certs\") pod \"custom-route-timeout-test-kserve-router-scheduler-975d74c9dwv6x\" (UID: \"6957063a-60e4-4d44-9b7e-97a59c1a183d\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-975d74c9dwv6x"
Apr 16 17:02:00.415223 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:02:00.415117 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/6957063a-60e4-4d44-9b7e-97a59c1a183d-tokenizer-tmp\") pod \"custom-route-timeout-test-kserve-router-scheduler-975d74c9dwv6x\" (UID: \"6957063a-60e4-4d44-9b7e-97a59c1a183d\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-975d74c9dwv6x"
Apr 16 17:02:00.415223 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:02:00.415135 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/6957063a-60e4-4d44-9b7e-97a59c1a183d-tokenizer-cache\") pod \"custom-route-timeout-test-kserve-router-scheduler-975d74c9dwv6x\" (UID: \"6957063a-60e4-4d44-9b7e-97a59c1a183d\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-975d74c9dwv6x"
Apr 16 17:02:00.515795 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:02:00.515768 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6957063a-60e4-4d44-9b7e-97a59c1a183d-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-router-scheduler-975d74c9dwv6x\" (UID: \"6957063a-60e4-4d44-9b7e-97a59c1a183d\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-975d74c9dwv6x"
Apr 16 17:02:00.515897 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:02:00.515806 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zkxsp\" (UniqueName: \"kubernetes.io/projected/6957063a-60e4-4d44-9b7e-97a59c1a183d-kube-api-access-zkxsp\") pod \"custom-route-timeout-test-kserve-router-scheduler-975d74c9dwv6x\" (UID: \"6957063a-60e4-4d44-9b7e-97a59c1a183d\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-975d74c9dwv6x"
Apr 16 17:02:00.515897 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:02:00.515826 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/6957063a-60e4-4d44-9b7e-97a59c1a183d-tokenizer-uds\") pod \"custom-route-timeout-test-kserve-router-scheduler-975d74c9dwv6x\" (UID: \"6957063a-60e4-4d44-9b7e-97a59c1a183d\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-975d74c9dwv6x"
Apr 16 17:02:00.515981 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:02:00.515942 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6957063a-60e4-4d44-9b7e-97a59c1a183d-tls-certs\") pod \"custom-route-timeout-test-kserve-router-scheduler-975d74c9dwv6x\" (UID: \"6957063a-60e4-4d44-9b7e-97a59c1a183d\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-975d74c9dwv6x"
Apr 16 17:02:00.516032 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:02:00.515980 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/6957063a-60e4-4d44-9b7e-97a59c1a183d-tokenizer-tmp\") pod \"custom-route-timeout-test-kserve-router-scheduler-975d74c9dwv6x\" (UID: \"6957063a-60e4-4d44-9b7e-97a59c1a183d\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-975d74c9dwv6x"
Apr 16 17:02:00.516032 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:02:00.516006 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/6957063a-60e4-4d44-9b7e-97a59c1a183d-tokenizer-cache\") pod \"custom-route-timeout-test-kserve-router-scheduler-975d74c9dwv6x\" (UID: \"6957063a-60e4-4d44-9b7e-97a59c1a183d\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-975d74c9dwv6x"
Apr 16 17:02:00.516129 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:02:00.516112 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/6957063a-60e4-4d44-9b7e-97a59c1a183d-tokenizer-uds\") pod \"custom-route-timeout-test-kserve-router-scheduler-975d74c9dwv6x\" (UID: \"6957063a-60e4-4d44-9b7e-97a59c1a183d\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-975d74c9dwv6x"
Apr 16 17:02:00.516196 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:02:00.516174 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName:
\"kubernetes.io/empty-dir/6957063a-60e4-4d44-9b7e-97a59c1a183d-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-router-scheduler-975d74c9dwv6x\" (UID: \"6957063a-60e4-4d44-9b7e-97a59c1a183d\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-975d74c9dwv6x" Apr 16 17:02:00.516264 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:02:00.516248 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/6957063a-60e4-4d44-9b7e-97a59c1a183d-tokenizer-tmp\") pod \"custom-route-timeout-test-kserve-router-scheduler-975d74c9dwv6x\" (UID: \"6957063a-60e4-4d44-9b7e-97a59c1a183d\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-975d74c9dwv6x" Apr 16 17:02:00.516318 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:02:00.516285 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/6957063a-60e4-4d44-9b7e-97a59c1a183d-tokenizer-cache\") pod \"custom-route-timeout-test-kserve-router-scheduler-975d74c9dwv6x\" (UID: \"6957063a-60e4-4d44-9b7e-97a59c1a183d\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-975d74c9dwv6x" Apr 16 17:02:00.518342 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:02:00.518320 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6957063a-60e4-4d44-9b7e-97a59c1a183d-tls-certs\") pod \"custom-route-timeout-test-kserve-router-scheduler-975d74c9dwv6x\" (UID: \"6957063a-60e4-4d44-9b7e-97a59c1a183d\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-975d74c9dwv6x" Apr 16 17:02:00.524600 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:02:00.524581 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkxsp\" (UniqueName: 
\"kubernetes.io/projected/6957063a-60e4-4d44-9b7e-97a59c1a183d-kube-api-access-zkxsp\") pod \"custom-route-timeout-test-kserve-router-scheduler-975d74c9dwv6x\" (UID: \"6957063a-60e4-4d44-9b7e-97a59c1a183d\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-975d74c9dwv6x" Apr 16 17:02:00.624254 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:02:00.624202 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-975d74c9dwv6x" Apr 16 17:02:00.744337 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:02:00.744241 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-975d74c9dwv6x"] Apr 16 17:02:00.746665 ip-10-0-138-58 kubenswrapper[2573]: W0416 17:02:00.746638 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6957063a_60e4_4d44_9b7e_97a59c1a183d.slice/crio-25d1ffc6f3fe59ef3c4c8bd26dd9ea0427af571211e72d360a09de1f7fcaabe4 WatchSource:0}: Error finding container 25d1ffc6f3fe59ef3c4c8bd26dd9ea0427af571211e72d360a09de1f7fcaabe4: Status 404 returned error can't find the container with id 25d1ffc6f3fe59ef3c4c8bd26dd9ea0427af571211e72d360a09de1f7fcaabe4 Apr 16 17:02:01.348237 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:02:01.348202 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-975d74c9dwv6x" event={"ID":"6957063a-60e4-4d44-9b7e-97a59c1a183d","Type":"ContainerStarted","Data":"54f0ca8037f6028d96d332a06f8d3d3fc070ae6fa6323e5d8f1887f53004f330"} Apr 16 17:02:01.348237 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:02:01.348240 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-975d74c9dwv6x" 
event={"ID":"6957063a-60e4-4d44-9b7e-97a59c1a183d","Type":"ContainerStarted","Data":"25d1ffc6f3fe59ef3c4c8bd26dd9ea0427af571211e72d360a09de1f7fcaabe4"} Apr 16 17:02:02.352998 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:02:02.352963 2573 generic.go:358] "Generic (PLEG): container finished" podID="6957063a-60e4-4d44-9b7e-97a59c1a183d" containerID="54f0ca8037f6028d96d332a06f8d3d3fc070ae6fa6323e5d8f1887f53004f330" exitCode=0 Apr 16 17:02:02.353414 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:02:02.353054 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-975d74c9dwv6x" event={"ID":"6957063a-60e4-4d44-9b7e-97a59c1a183d","Type":"ContainerDied","Data":"54f0ca8037f6028d96d332a06f8d3d3fc070ae6fa6323e5d8f1887f53004f330"} Apr 16 17:02:03.358885 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:02:03.358848 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-975d74c9dwv6x" event={"ID":"6957063a-60e4-4d44-9b7e-97a59c1a183d","Type":"ContainerStarted","Data":"78f388f9d5a26bb6eb8f668d27dff6ab0cee7b2fe8abb84ed857db677e089aea"} Apr 16 17:02:03.359271 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:02:03.358890 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-975d74c9dwv6x" event={"ID":"6957063a-60e4-4d44-9b7e-97a59c1a183d","Type":"ContainerStarted","Data":"d37d37efc5985783649d4a7e52d87618f6f121342b45dcfff0afb2f4b08e01df"} Apr 16 17:02:03.359271 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:02:03.358982 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-975d74c9dwv6x" Apr 16 17:02:03.381001 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:02:03.380959 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-975d74c9dwv6x" podStartSLOduration=3.380946825 podStartE2EDuration="3.380946825s" podCreationTimestamp="2026-04-16 17:02:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 17:02:03.379423396 +0000 UTC m=+848.520142139" watchObservedRunningTime="2026-04-16 17:02:03.380946825 +0000 UTC m=+848.521665567" Apr 16 17:02:10.624546 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:02:10.624513 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-975d74c9dwv6x" Apr 16 17:02:10.624546 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:02:10.624553 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-975d74c9dwv6x" Apr 16 17:02:10.627300 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:02:10.627274 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-975d74c9dwv6x" Apr 16 17:02:11.384639 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:02:11.384613 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-975d74c9dwv6x" Apr 16 17:02:12.313409 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:02:12.313365 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7db4d9c6b5-zg9vl" Apr 16 17:02:32.388286 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:02:32.388214 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-975d74c9dwv6x" Apr 16 17:02:55.374436 ip-10-0-138-58 kubenswrapper[2573]: 
I0416 17:02:55.374327 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-jgsfb_6c99cce0-b27a-481f-8825-9d205581b7d0/console-operator/1.log" Apr 16 17:02:55.375001 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:02:55.374812 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-jgsfb_6c99cce0-b27a-481f-8825-9d205581b7d0/console-operator/1.log" Apr 16 17:03:43.006683 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:03:43.006654 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7db4d9c6b5-zg9vl"] Apr 16 17:03:43.007121 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:03:43.006964 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7db4d9c6b5-zg9vl" podUID="8d23b6ec-8709-4840-aa58-3f21e0ced27e" containerName="main" containerID="cri-o://ba72fb95defc0b79154b441825c3bf7315349fdbf3f7173d455460c35e3d6a59" gracePeriod=30 Apr 16 17:03:43.007121 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:03:43.007104 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7db4d9c6b5-zg9vl" podUID="8d23b6ec-8709-4840-aa58-3f21e0ced27e" containerName="tokenizer" containerID="cri-o://0c1938812c1a5d88292cede90db5c095410098baed9a06dc3dcff4073fbfef0c" gracePeriod=30 Apr 16 17:03:43.685265 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:03:43.685236 2573 generic.go:358] "Generic (PLEG): container finished" podID="8d23b6ec-8709-4840-aa58-3f21e0ced27e" containerID="ba72fb95defc0b79154b441825c3bf7315349fdbf3f7173d455460c35e3d6a59" exitCode=0 Apr 16 17:03:43.685455 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:03:43.685317 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7db4d9c6b5-zg9vl" event={"ID":"8d23b6ec-8709-4840-aa58-3f21e0ced27e","Type":"ContainerDied","Data":"ba72fb95defc0b79154b441825c3bf7315349fdbf3f7173d455460c35e3d6a59"} Apr 16 17:03:44.065003 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:03:44.064984 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7db4d9c6b5-zg9vl" Apr 16 17:03:44.178939 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:03:44.178855 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8d23b6ec-8709-4840-aa58-3f21e0ced27e-tls-certs\") pod \"8d23b6ec-8709-4840-aa58-3f21e0ced27e\" (UID: \"8d23b6ec-8709-4840-aa58-3f21e0ced27e\") " Apr 16 17:03:44.179050 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:03:44.179000 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8d23b6ec-8709-4840-aa58-3f21e0ced27e-kserve-provision-location\") pod \"8d23b6ec-8709-4840-aa58-3f21e0ced27e\" (UID: \"8d23b6ec-8709-4840-aa58-3f21e0ced27e\") " Apr 16 17:03:44.179050 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:03:44.179035 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/8d23b6ec-8709-4840-aa58-3f21e0ced27e-tokenizer-tmp\") pod \"8d23b6ec-8709-4840-aa58-3f21e0ced27e\" (UID: \"8d23b6ec-8709-4840-aa58-3f21e0ced27e\") " Apr 16 17:03:44.179127 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:03:44.179061 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/8d23b6ec-8709-4840-aa58-3f21e0ced27e-tokenizer-cache\") pod \"8d23b6ec-8709-4840-aa58-3f21e0ced27e\" (UID: \"8d23b6ec-8709-4840-aa58-3f21e0ced27e\") " Apr 16 17:03:44.179127 
ip-10-0-138-58 kubenswrapper[2573]: I0416 17:03:44.179114 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/8d23b6ec-8709-4840-aa58-3f21e0ced27e-tokenizer-uds\") pod \"8d23b6ec-8709-4840-aa58-3f21e0ced27e\" (UID: \"8d23b6ec-8709-4840-aa58-3f21e0ced27e\") " Apr 16 17:03:44.179212 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:03:44.179166 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wlg54\" (UniqueName: \"kubernetes.io/projected/8d23b6ec-8709-4840-aa58-3f21e0ced27e-kube-api-access-wlg54\") pod \"8d23b6ec-8709-4840-aa58-3f21e0ced27e\" (UID: \"8d23b6ec-8709-4840-aa58-3f21e0ced27e\") " Apr 16 17:03:44.179366 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:03:44.179337 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d23b6ec-8709-4840-aa58-3f21e0ced27e-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "8d23b6ec-8709-4840-aa58-3f21e0ced27e" (UID: "8d23b6ec-8709-4840-aa58-3f21e0ced27e"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:03:44.179474 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:03:44.179436 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d23b6ec-8709-4840-aa58-3f21e0ced27e-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "8d23b6ec-8709-4840-aa58-3f21e0ced27e" (UID: "8d23b6ec-8709-4840-aa58-3f21e0ced27e"). InnerVolumeSpecName "tokenizer-tmp". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:03:44.179523 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:03:44.179464 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d23b6ec-8709-4840-aa58-3f21e0ced27e-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "8d23b6ec-8709-4840-aa58-3f21e0ced27e" (UID: "8d23b6ec-8709-4840-aa58-3f21e0ced27e"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:03:44.179768 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:03:44.179748 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d23b6ec-8709-4840-aa58-3f21e0ced27e-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "8d23b6ec-8709-4840-aa58-3f21e0ced27e" (UID: "8d23b6ec-8709-4840-aa58-3f21e0ced27e"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:03:44.180926 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:03:44.180902 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d23b6ec-8709-4840-aa58-3f21e0ced27e-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "8d23b6ec-8709-4840-aa58-3f21e0ced27e" (UID: "8d23b6ec-8709-4840-aa58-3f21e0ced27e"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 17:03:44.181134 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:03:44.181118 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d23b6ec-8709-4840-aa58-3f21e0ced27e-kube-api-access-wlg54" (OuterVolumeSpecName: "kube-api-access-wlg54") pod "8d23b6ec-8709-4840-aa58-3f21e0ced27e" (UID: "8d23b6ec-8709-4840-aa58-3f21e0ced27e"). InnerVolumeSpecName "kube-api-access-wlg54". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 17:03:44.280191 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:03:44.280140 2573 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/8d23b6ec-8709-4840-aa58-3f21e0ced27e-tokenizer-uds\") on node \"ip-10-0-138-58.ec2.internal\" DevicePath \"\"" Apr 16 17:03:44.280191 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:03:44.280162 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wlg54\" (UniqueName: \"kubernetes.io/projected/8d23b6ec-8709-4840-aa58-3f21e0ced27e-kube-api-access-wlg54\") on node \"ip-10-0-138-58.ec2.internal\" DevicePath \"\"" Apr 16 17:03:44.280191 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:03:44.280172 2573 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8d23b6ec-8709-4840-aa58-3f21e0ced27e-tls-certs\") on node \"ip-10-0-138-58.ec2.internal\" DevicePath \"\"" Apr 16 17:03:44.280191 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:03:44.280182 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8d23b6ec-8709-4840-aa58-3f21e0ced27e-kserve-provision-location\") on node \"ip-10-0-138-58.ec2.internal\" DevicePath \"\"" Apr 16 17:03:44.280191 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:03:44.280191 2573 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/8d23b6ec-8709-4840-aa58-3f21e0ced27e-tokenizer-tmp\") on node \"ip-10-0-138-58.ec2.internal\" DevicePath \"\"" Apr 16 17:03:44.280411 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:03:44.280199 2573 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/8d23b6ec-8709-4840-aa58-3f21e0ced27e-tokenizer-cache\") on node \"ip-10-0-138-58.ec2.internal\" DevicePath \"\"" Apr 16 17:03:44.690151 ip-10-0-138-58 kubenswrapper[2573]: 
I0416 17:03:44.690125 2573 generic.go:358] "Generic (PLEG): container finished" podID="8d23b6ec-8709-4840-aa58-3f21e0ced27e" containerID="0c1938812c1a5d88292cede90db5c095410098baed9a06dc3dcff4073fbfef0c" exitCode=0 Apr 16 17:03:44.690299 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:03:44.690164 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7db4d9c6b5-zg9vl" event={"ID":"8d23b6ec-8709-4840-aa58-3f21e0ced27e","Type":"ContainerDied","Data":"0c1938812c1a5d88292cede90db5c095410098baed9a06dc3dcff4073fbfef0c"} Apr 16 17:03:44.690299 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:03:44.690185 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7db4d9c6b5-zg9vl" event={"ID":"8d23b6ec-8709-4840-aa58-3f21e0ced27e","Type":"ContainerDied","Data":"9b62d44beba5282fd3890c943525ea31c8b052b388b1601abd8268103d38adf6"} Apr 16 17:03:44.690299 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:03:44.690200 2573 scope.go:117] "RemoveContainer" containerID="0c1938812c1a5d88292cede90db5c095410098baed9a06dc3dcff4073fbfef0c" Apr 16 17:03:44.690299 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:03:44.690197 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7db4d9c6b5-zg9vl" Apr 16 17:03:44.698860 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:03:44.698756 2573 scope.go:117] "RemoveContainer" containerID="ba72fb95defc0b79154b441825c3bf7315349fdbf3f7173d455460c35e3d6a59" Apr 16 17:03:44.705856 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:03:44.705840 2573 scope.go:117] "RemoveContainer" containerID="57bb8c2ae8a14566712560f20c517ac59735695a0f18950bd2824436c4e9c836" Apr 16 17:03:44.712563 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:03:44.712541 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7db4d9c6b5-zg9vl"] Apr 16 17:03:44.712935 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:03:44.712921 2573 scope.go:117] "RemoveContainer" containerID="0c1938812c1a5d88292cede90db5c095410098baed9a06dc3dcff4073fbfef0c" Apr 16 17:03:44.713193 ip-10-0-138-58 kubenswrapper[2573]: E0416 17:03:44.713176 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c1938812c1a5d88292cede90db5c095410098baed9a06dc3dcff4073fbfef0c\": container with ID starting with 0c1938812c1a5d88292cede90db5c095410098baed9a06dc3dcff4073fbfef0c not found: ID does not exist" containerID="0c1938812c1a5d88292cede90db5c095410098baed9a06dc3dcff4073fbfef0c" Apr 16 17:03:44.713266 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:03:44.713204 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c1938812c1a5d88292cede90db5c095410098baed9a06dc3dcff4073fbfef0c"} err="failed to get container status \"0c1938812c1a5d88292cede90db5c095410098baed9a06dc3dcff4073fbfef0c\": rpc error: code = NotFound desc = could not find container \"0c1938812c1a5d88292cede90db5c095410098baed9a06dc3dcff4073fbfef0c\": container with ID starting with 0c1938812c1a5d88292cede90db5c095410098baed9a06dc3dcff4073fbfef0c not found: ID 
does not exist" Apr 16 17:03:44.713266 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:03:44.713227 2573 scope.go:117] "RemoveContainer" containerID="ba72fb95defc0b79154b441825c3bf7315349fdbf3f7173d455460c35e3d6a59" Apr 16 17:03:44.713526 ip-10-0-138-58 kubenswrapper[2573]: E0416 17:03:44.713508 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba72fb95defc0b79154b441825c3bf7315349fdbf3f7173d455460c35e3d6a59\": container with ID starting with ba72fb95defc0b79154b441825c3bf7315349fdbf3f7173d455460c35e3d6a59 not found: ID does not exist" containerID="ba72fb95defc0b79154b441825c3bf7315349fdbf3f7173d455460c35e3d6a59" Apr 16 17:03:44.713578 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:03:44.713532 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba72fb95defc0b79154b441825c3bf7315349fdbf3f7173d455460c35e3d6a59"} err="failed to get container status \"ba72fb95defc0b79154b441825c3bf7315349fdbf3f7173d455460c35e3d6a59\": rpc error: code = NotFound desc = could not find container \"ba72fb95defc0b79154b441825c3bf7315349fdbf3f7173d455460c35e3d6a59\": container with ID starting with ba72fb95defc0b79154b441825c3bf7315349fdbf3f7173d455460c35e3d6a59 not found: ID does not exist" Apr 16 17:03:44.713578 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:03:44.713549 2573 scope.go:117] "RemoveContainer" containerID="57bb8c2ae8a14566712560f20c517ac59735695a0f18950bd2824436c4e9c836" Apr 16 17:03:44.713780 ip-10-0-138-58 kubenswrapper[2573]: E0416 17:03:44.713763 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57bb8c2ae8a14566712560f20c517ac59735695a0f18950bd2824436c4e9c836\": container with ID starting with 57bb8c2ae8a14566712560f20c517ac59735695a0f18950bd2824436c4e9c836 not found: ID does not exist" containerID="57bb8c2ae8a14566712560f20c517ac59735695a0f18950bd2824436c4e9c836" Apr 16 
17:03:44.713819 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:03:44.713786 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57bb8c2ae8a14566712560f20c517ac59735695a0f18950bd2824436c4e9c836"} err="failed to get container status \"57bb8c2ae8a14566712560f20c517ac59735695a0f18950bd2824436c4e9c836\": rpc error: code = NotFound desc = could not find container \"57bb8c2ae8a14566712560f20c517ac59735695a0f18950bd2824436c4e9c836\": container with ID starting with 57bb8c2ae8a14566712560f20c517ac59735695a0f18950bd2824436c4e9c836 not found: ID does not exist" Apr 16 17:03:44.716991 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:03:44.716970 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7db4d9c6b5-zg9vl"] Apr 16 17:03:45.420915 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:03:45.420884 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d23b6ec-8709-4840-aa58-3f21e0ced27e" path="/var/lib/kubelet/pods/8d23b6ec-8709-4840-aa58-3f21e0ced27e/volumes" Apr 16 17:03:52.259334 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:03:52.259297 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7db4d9c6b5-jdgbg"] Apr 16 17:03:52.259930 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:03:52.259768 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8d23b6ec-8709-4840-aa58-3f21e0ced27e" containerName="tokenizer" Apr 16 17:03:52.259930 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:03:52.259788 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d23b6ec-8709-4840-aa58-3f21e0ced27e" containerName="tokenizer" Apr 16 17:03:52.259930 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:03:52.259805 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8d23b6ec-8709-4840-aa58-3f21e0ced27e" containerName="main" Apr 16 17:03:52.259930 
ip-10-0-138-58 kubenswrapper[2573]: I0416 17:03:52.259813 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d23b6ec-8709-4840-aa58-3f21e0ced27e" containerName="main" Apr 16 17:03:52.259930 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:03:52.259848 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8d23b6ec-8709-4840-aa58-3f21e0ced27e" containerName="storage-initializer" Apr 16 17:03:52.259930 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:03:52.259856 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d23b6ec-8709-4840-aa58-3f21e0ced27e" containerName="storage-initializer" Apr 16 17:03:52.260231 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:03:52.259937 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="8d23b6ec-8709-4840-aa58-3f21e0ced27e" containerName="tokenizer" Apr 16 17:03:52.260231 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:03:52.259949 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="8d23b6ec-8709-4840-aa58-3f21e0ced27e" containerName="main" Apr 16 17:03:52.265036 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:03:52.265013 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7db4d9c6b5-jdgbg" Apr 16 17:03:52.267660 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:03:52.267633 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-epp-sa-dockercfg-lh5xc\"" Apr 16 17:03:52.267780 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:03:52.267693 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-kserve-self-signed-certs\"" Apr 16 17:03:52.271121 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:03:52.271098 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7db4d9c6b5-jdgbg"] Apr 16 17:03:52.337070 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:03:52.337043 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/50631686-fdca-4bcb-90ad-e0445b733f21-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-7db4d9c6b5-jdgbg\" (UID: \"50631686-fdca-4bcb-90ad-e0445b733f21\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7db4d9c6b5-jdgbg" Apr 16 17:03:52.337161 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:03:52.337079 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/50631686-fdca-4bcb-90ad-e0445b733f21-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-7db4d9c6b5-jdgbg\" (UID: \"50631686-fdca-4bcb-90ad-e0445b733f21\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7db4d9c6b5-jdgbg" Apr 16 17:03:52.337161 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:03:52.337118 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" 
(UniqueName: \"kubernetes.io/empty-dir/50631686-fdca-4bcb-90ad-e0445b733f21-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-7db4d9c6b5-jdgbg\" (UID: \"50631686-fdca-4bcb-90ad-e0445b733f21\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7db4d9c6b5-jdgbg" Apr 16 17:03:52.337161 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:03:52.337144 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/50631686-fdca-4bcb-90ad-e0445b733f21-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-7db4d9c6b5-jdgbg\" (UID: \"50631686-fdca-4bcb-90ad-e0445b733f21\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7db4d9c6b5-jdgbg" Apr 16 17:03:52.337262 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:03:52.337172 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/50631686-fdca-4bcb-90ad-e0445b733f21-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-7db4d9c6b5-jdgbg\" (UID: \"50631686-fdca-4bcb-90ad-e0445b733f21\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7db4d9c6b5-jdgbg" Apr 16 17:03:52.337262 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:03:52.337197 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lzdj\" (UniqueName: \"kubernetes.io/projected/50631686-fdca-4bcb-90ad-e0445b733f21-kube-api-access-9lzdj\") pod \"stop-feature-test-kserve-router-scheduler-7db4d9c6b5-jdgbg\" (UID: \"50631686-fdca-4bcb-90ad-e0445b733f21\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7db4d9c6b5-jdgbg" Apr 16 17:03:52.437997 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:03:52.437971 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/50631686-fdca-4bcb-90ad-e0445b733f21-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-7db4d9c6b5-jdgbg\" (UID: \"50631686-fdca-4bcb-90ad-e0445b733f21\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7db4d9c6b5-jdgbg" Apr 16 17:03:52.438101 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:03:52.438008 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/50631686-fdca-4bcb-90ad-e0445b733f21-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-7db4d9c6b5-jdgbg\" (UID: \"50631686-fdca-4bcb-90ad-e0445b733f21\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7db4d9c6b5-jdgbg" Apr 16 17:03:52.438101 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:03:52.438040 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/50631686-fdca-4bcb-90ad-e0445b733f21-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-7db4d9c6b5-jdgbg\" (UID: \"50631686-fdca-4bcb-90ad-e0445b733f21\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7db4d9c6b5-jdgbg" Apr 16 17:03:52.438101 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:03:52.438070 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9lzdj\" (UniqueName: \"kubernetes.io/projected/50631686-fdca-4bcb-90ad-e0445b733f21-kube-api-access-9lzdj\") pod \"stop-feature-test-kserve-router-scheduler-7db4d9c6b5-jdgbg\" (UID: \"50631686-fdca-4bcb-90ad-e0445b733f21\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7db4d9c6b5-jdgbg" Apr 16 17:03:52.438244 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:03:52.438113 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/50631686-fdca-4bcb-90ad-e0445b733f21-tokenizer-uds\") pod 
\"stop-feature-test-kserve-router-scheduler-7db4d9c6b5-jdgbg\" (UID: \"50631686-fdca-4bcb-90ad-e0445b733f21\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7db4d9c6b5-jdgbg" Apr 16 17:03:52.438244 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:03:52.438140 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/50631686-fdca-4bcb-90ad-e0445b733f21-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-7db4d9c6b5-jdgbg\" (UID: \"50631686-fdca-4bcb-90ad-e0445b733f21\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7db4d9c6b5-jdgbg" Apr 16 17:03:52.438438 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:03:52.438416 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/50631686-fdca-4bcb-90ad-e0445b733f21-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-7db4d9c6b5-jdgbg\" (UID: \"50631686-fdca-4bcb-90ad-e0445b733f21\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7db4d9c6b5-jdgbg" Apr 16 17:03:52.438482 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:03:52.438453 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/50631686-fdca-4bcb-90ad-e0445b733f21-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-7db4d9c6b5-jdgbg\" (UID: \"50631686-fdca-4bcb-90ad-e0445b733f21\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7db4d9c6b5-jdgbg" Apr 16 17:03:52.438515 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:03:52.438500 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/50631686-fdca-4bcb-90ad-e0445b733f21-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-7db4d9c6b5-jdgbg\" (UID: 
\"50631686-fdca-4bcb-90ad-e0445b733f21\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7db4d9c6b5-jdgbg" Apr 16 17:03:52.438582 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:03:52.438564 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/50631686-fdca-4bcb-90ad-e0445b733f21-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-7db4d9c6b5-jdgbg\" (UID: \"50631686-fdca-4bcb-90ad-e0445b733f21\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7db4d9c6b5-jdgbg" Apr 16 17:03:52.440598 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:03:52.440579 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/50631686-fdca-4bcb-90ad-e0445b733f21-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-7db4d9c6b5-jdgbg\" (UID: \"50631686-fdca-4bcb-90ad-e0445b733f21\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7db4d9c6b5-jdgbg" Apr 16 17:03:52.445594 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:03:52.445577 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lzdj\" (UniqueName: \"kubernetes.io/projected/50631686-fdca-4bcb-90ad-e0445b733f21-kube-api-access-9lzdj\") pod \"stop-feature-test-kserve-router-scheduler-7db4d9c6b5-jdgbg\" (UID: \"50631686-fdca-4bcb-90ad-e0445b733f21\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7db4d9c6b5-jdgbg" Apr 16 17:03:52.576767 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:03:52.576743 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7db4d9c6b5-jdgbg" Apr 16 17:03:52.901167 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:03:52.901088 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7db4d9c6b5-jdgbg"] Apr 16 17:03:52.904522 ip-10-0-138-58 kubenswrapper[2573]: W0416 17:03:52.904490 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod50631686_fdca_4bcb_90ad_e0445b733f21.slice/crio-f2a0a2064b9cdf8d4ecfdbbbfd70b77606b20f7141246eae06fd721e9c9c2521 WatchSource:0}: Error finding container f2a0a2064b9cdf8d4ecfdbbbfd70b77606b20f7141246eae06fd721e9c9c2521: Status 404 returned error can't find the container with id f2a0a2064b9cdf8d4ecfdbbbfd70b77606b20f7141246eae06fd721e9c9c2521 Apr 16 17:03:53.728571 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:03:53.728535 2573 generic.go:358] "Generic (PLEG): container finished" podID="50631686-fdca-4bcb-90ad-e0445b733f21" containerID="94fa8ff59a8041b1ddd7bf31b2cdc982806e3477f4ca8fe82945f14365a87e98" exitCode=0 Apr 16 17:03:53.728877 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:03:53.728617 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7db4d9c6b5-jdgbg" event={"ID":"50631686-fdca-4bcb-90ad-e0445b733f21","Type":"ContainerDied","Data":"94fa8ff59a8041b1ddd7bf31b2cdc982806e3477f4ca8fe82945f14365a87e98"} Apr 16 17:03:53.728877 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:03:53.728652 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7db4d9c6b5-jdgbg" event={"ID":"50631686-fdca-4bcb-90ad-e0445b733f21","Type":"ContainerStarted","Data":"f2a0a2064b9cdf8d4ecfdbbbfd70b77606b20f7141246eae06fd721e9c9c2521"} Apr 16 17:03:54.734208 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:03:54.734178 2573 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7db4d9c6b5-jdgbg" event={"ID":"50631686-fdca-4bcb-90ad-e0445b733f21","Type":"ContainerStarted","Data":"776acc62656e1c7618074088a633f6e620b21f0a733eb9cd797280817b2753c4"} Apr 16 17:03:54.734208 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:03:54.734213 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7db4d9c6b5-jdgbg" event={"ID":"50631686-fdca-4bcb-90ad-e0445b733f21","Type":"ContainerStarted","Data":"5c83c4f76f679b5dc339f6d01a5e8dfea05a5bd6fe15404299937c99bddd49d9"} Apr 16 17:03:54.734667 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:03:54.734295 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7db4d9c6b5-jdgbg" Apr 16 17:03:54.755560 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:03:54.755520 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7db4d9c6b5-jdgbg" podStartSLOduration=2.75550388 podStartE2EDuration="2.75550388s" podCreationTimestamp="2026-04-16 17:03:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 17:03:54.754976577 +0000 UTC m=+959.895695328" watchObservedRunningTime="2026-04-16 17:03:54.75550388 +0000 UTC m=+959.896222619" Apr 16 17:04:02.577577 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:04:02.577529 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7db4d9c6b5-jdgbg" Apr 16 17:04:02.577577 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:04:02.577583 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7db4d9c6b5-jdgbg" Apr 16 17:04:02.580427 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:04:02.580400 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7db4d9c6b5-jdgbg" Apr 16 17:04:02.761969 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:04:02.761947 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7db4d9c6b5-jdgbg" Apr 16 17:04:07.856571 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:04:07.856499 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-975d74c9dwv6x"] Apr 16 17:04:07.856937 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:04:07.856823 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-975d74c9dwv6x" podUID="6957063a-60e4-4d44-9b7e-97a59c1a183d" containerName="main" containerID="cri-o://d37d37efc5985783649d4a7e52d87618f6f121342b45dcfff0afb2f4b08e01df" gracePeriod=30 Apr 16 17:04:07.856937 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:04:07.856871 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-975d74c9dwv6x" podUID="6957063a-60e4-4d44-9b7e-97a59c1a183d" containerName="tokenizer" containerID="cri-o://78f388f9d5a26bb6eb8f668d27dff6ab0cee7b2fe8abb84ed857db677e089aea" gracePeriod=30 Apr 16 17:04:08.780862 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:04:08.780834 2573 generic.go:358] "Generic (PLEG): container finished" podID="6957063a-60e4-4d44-9b7e-97a59c1a183d" containerID="78f388f9d5a26bb6eb8f668d27dff6ab0cee7b2fe8abb84ed857db677e089aea" exitCode=0 Apr 16 17:04:08.780862 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:04:08.780860 2573 generic.go:358] 
"Generic (PLEG): container finished" podID="6957063a-60e4-4d44-9b7e-97a59c1a183d" containerID="d37d37efc5985783649d4a7e52d87618f6f121342b45dcfff0afb2f4b08e01df" exitCode=0 Apr 16 17:04:08.781056 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:04:08.780904 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-975d74c9dwv6x" event={"ID":"6957063a-60e4-4d44-9b7e-97a59c1a183d","Type":"ContainerDied","Data":"78f388f9d5a26bb6eb8f668d27dff6ab0cee7b2fe8abb84ed857db677e089aea"} Apr 16 17:04:08.781056 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:04:08.780939 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-975d74c9dwv6x" event={"ID":"6957063a-60e4-4d44-9b7e-97a59c1a183d","Type":"ContainerDied","Data":"d37d37efc5985783649d4a7e52d87618f6f121342b45dcfff0afb2f4b08e01df"} Apr 16 17:04:08.899807 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:04:08.899783 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-975d74c9dwv6x" Apr 16 17:04:09.062678 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:04:09.062653 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/6957063a-60e4-4d44-9b7e-97a59c1a183d-tokenizer-uds\") pod \"6957063a-60e4-4d44-9b7e-97a59c1a183d\" (UID: \"6957063a-60e4-4d44-9b7e-97a59c1a183d\") " Apr 16 17:04:09.062826 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:04:09.062686 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6957063a-60e4-4d44-9b7e-97a59c1a183d-tls-certs\") pod \"6957063a-60e4-4d44-9b7e-97a59c1a183d\" (UID: \"6957063a-60e4-4d44-9b7e-97a59c1a183d\") " Apr 16 17:04:09.062826 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:04:09.062715 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6957063a-60e4-4d44-9b7e-97a59c1a183d-kserve-provision-location\") pod \"6957063a-60e4-4d44-9b7e-97a59c1a183d\" (UID: \"6957063a-60e4-4d44-9b7e-97a59c1a183d\") " Apr 16 17:04:09.062826 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:04:09.062750 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/6957063a-60e4-4d44-9b7e-97a59c1a183d-tokenizer-tmp\") pod \"6957063a-60e4-4d44-9b7e-97a59c1a183d\" (UID: \"6957063a-60e4-4d44-9b7e-97a59c1a183d\") " Apr 16 17:04:09.062826 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:04:09.062778 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkxsp\" (UniqueName: \"kubernetes.io/projected/6957063a-60e4-4d44-9b7e-97a59c1a183d-kube-api-access-zkxsp\") pod \"6957063a-60e4-4d44-9b7e-97a59c1a183d\" (UID: \"6957063a-60e4-4d44-9b7e-97a59c1a183d\") " Apr 16 
17:04:09.062826 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:04:09.062824 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/6957063a-60e4-4d44-9b7e-97a59c1a183d-tokenizer-cache\") pod \"6957063a-60e4-4d44-9b7e-97a59c1a183d\" (UID: \"6957063a-60e4-4d44-9b7e-97a59c1a183d\") " Apr 16 17:04:09.063081 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:04:09.062969 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6957063a-60e4-4d44-9b7e-97a59c1a183d-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "6957063a-60e4-4d44-9b7e-97a59c1a183d" (UID: "6957063a-60e4-4d44-9b7e-97a59c1a183d"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:04:09.063139 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:04:09.063098 2573 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/6957063a-60e4-4d44-9b7e-97a59c1a183d-tokenizer-uds\") on node \"ip-10-0-138-58.ec2.internal\" DevicePath \"\"" Apr 16 17:04:09.063190 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:04:09.063143 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6957063a-60e4-4d44-9b7e-97a59c1a183d-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "6957063a-60e4-4d44-9b7e-97a59c1a183d" (UID: "6957063a-60e4-4d44-9b7e-97a59c1a183d"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:04:09.063190 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:04:09.063168 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6957063a-60e4-4d44-9b7e-97a59c1a183d-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "6957063a-60e4-4d44-9b7e-97a59c1a183d" (UID: "6957063a-60e4-4d44-9b7e-97a59c1a183d"). InnerVolumeSpecName "tokenizer-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:04:09.063540 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:04:09.063519 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6957063a-60e4-4d44-9b7e-97a59c1a183d-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "6957063a-60e4-4d44-9b7e-97a59c1a183d" (UID: "6957063a-60e4-4d44-9b7e-97a59c1a183d"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:04:09.064801 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:04:09.064776 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6957063a-60e4-4d44-9b7e-97a59c1a183d-kube-api-access-zkxsp" (OuterVolumeSpecName: "kube-api-access-zkxsp") pod "6957063a-60e4-4d44-9b7e-97a59c1a183d" (UID: "6957063a-60e4-4d44-9b7e-97a59c1a183d"). InnerVolumeSpecName "kube-api-access-zkxsp". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 17:04:09.064894 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:04:09.064785 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6957063a-60e4-4d44-9b7e-97a59c1a183d-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "6957063a-60e4-4d44-9b7e-97a59c1a183d" (UID: "6957063a-60e4-4d44-9b7e-97a59c1a183d"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 17:04:09.164002 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:04:09.163982 2573 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6957063a-60e4-4d44-9b7e-97a59c1a183d-tls-certs\") on node \"ip-10-0-138-58.ec2.internal\" DevicePath \"\"" Apr 16 17:04:09.164002 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:04:09.164002 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6957063a-60e4-4d44-9b7e-97a59c1a183d-kserve-provision-location\") on node \"ip-10-0-138-58.ec2.internal\" DevicePath \"\"" Apr 16 17:04:09.164121 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:04:09.164014 2573 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/6957063a-60e4-4d44-9b7e-97a59c1a183d-tokenizer-tmp\") on node \"ip-10-0-138-58.ec2.internal\" DevicePath \"\"" Apr 16 17:04:09.164121 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:04:09.164024 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zkxsp\" (UniqueName: \"kubernetes.io/projected/6957063a-60e4-4d44-9b7e-97a59c1a183d-kube-api-access-zkxsp\") on node \"ip-10-0-138-58.ec2.internal\" DevicePath \"\"" Apr 16 17:04:09.164121 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:04:09.164035 2573 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/6957063a-60e4-4d44-9b7e-97a59c1a183d-tokenizer-cache\") on node \"ip-10-0-138-58.ec2.internal\" DevicePath \"\"" Apr 16 17:04:09.785456 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:04:09.785431 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-975d74c9dwv6x" event={"ID":"6957063a-60e4-4d44-9b7e-97a59c1a183d","Type":"ContainerDied","Data":"25d1ffc6f3fe59ef3c4c8bd26dd9ea0427af571211e72d360a09de1f7fcaabe4"} 
Apr 16 17:04:09.785587 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:04:09.785467 2573 scope.go:117] "RemoveContainer" containerID="78f388f9d5a26bb6eb8f668d27dff6ab0cee7b2fe8abb84ed857db677e089aea" Apr 16 17:04:09.785587 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:04:09.785530 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-975d74c9dwv6x" Apr 16 17:04:09.793470 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:04:09.793450 2573 scope.go:117] "RemoveContainer" containerID="d37d37efc5985783649d4a7e52d87618f6f121342b45dcfff0afb2f4b08e01df" Apr 16 17:04:09.800231 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:04:09.800215 2573 scope.go:117] "RemoveContainer" containerID="54f0ca8037f6028d96d332a06f8d3d3fc070ae6fa6323e5d8f1887f53004f330" Apr 16 17:04:09.804373 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:04:09.804352 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-975d74c9dwv6x"] Apr 16 17:04:09.807950 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:04:09.807933 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-975d74c9dwv6x"] Apr 16 17:04:11.422597 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:04:11.422558 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6957063a-60e4-4d44-9b7e-97a59c1a183d" path="/var/lib/kubelet/pods/6957063a-60e4-4d44-9b7e-97a59c1a183d/volumes" Apr 16 17:04:18.220420 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:04:18.220390 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8657c58775-xld44"] Apr 16 17:04:18.220962 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:04:18.220925 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6957063a-60e4-4d44-9b7e-97a59c1a183d" 
containerName="main" Apr 16 17:04:18.220962 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:04:18.220953 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="6957063a-60e4-4d44-9b7e-97a59c1a183d" containerName="main" Apr 16 17:04:18.221158 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:04:18.220989 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6957063a-60e4-4d44-9b7e-97a59c1a183d" containerName="tokenizer" Apr 16 17:04:18.221158 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:04:18.220998 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="6957063a-60e4-4d44-9b7e-97a59c1a183d" containerName="tokenizer" Apr 16 17:04:18.221158 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:04:18.221009 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6957063a-60e4-4d44-9b7e-97a59c1a183d" containerName="storage-initializer" Apr 16 17:04:18.221158 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:04:18.221018 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="6957063a-60e4-4d44-9b7e-97a59c1a183d" containerName="storage-initializer" Apr 16 17:04:18.221158 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:04:18.221104 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="6957063a-60e4-4d44-9b7e-97a59c1a183d" containerName="tokenizer" Apr 16 17:04:18.221158 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:04:18.221120 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="6957063a-60e4-4d44-9b7e-97a59c1a183d" containerName="main" Apr 16 17:04:18.230447 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:04:18.230414 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8657c58775-xld44" Apr 16 17:04:18.233417 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:04:18.233369 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-test-epp-sa-dockercfg-cqff6\"" Apr 16 17:04:18.233522 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:04:18.233434 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8657c58775-xld44"] Apr 16 17:04:18.233522 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:04:18.233447 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-test-kserve-self-signed-certs\"" Apr 16 17:04:18.329722 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:04:18.329690 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jv5r2\" (UniqueName: \"kubernetes.io/projected/a7409492-d7b6-4c4c-80d4-9188d29886a6-kube-api-access-jv5r2\") pod \"router-with-refs-test-kserve-router-scheduler-8657c58775-xld44\" (UID: \"a7409492-d7b6-4c4c-80d4-9188d29886a6\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8657c58775-xld44" Apr 16 17:04:18.329887 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:04:18.329728 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a7409492-d7b6-4c4c-80d4-9188d29886a6-tls-certs\") pod \"router-with-refs-test-kserve-router-scheduler-8657c58775-xld44\" (UID: \"a7409492-d7b6-4c4c-80d4-9188d29886a6\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8657c58775-xld44" Apr 16 17:04:18.329887 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:04:18.329753 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a7409492-d7b6-4c4c-80d4-9188d29886a6-kserve-provision-location\") pod \"router-with-refs-test-kserve-router-scheduler-8657c58775-xld44\" (UID: \"a7409492-d7b6-4c4c-80d4-9188d29886a6\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8657c58775-xld44" Apr 16 17:04:18.329887 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:04:18.329835 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/a7409492-d7b6-4c4c-80d4-9188d29886a6-tokenizer-uds\") pod \"router-with-refs-test-kserve-router-scheduler-8657c58775-xld44\" (UID: \"a7409492-d7b6-4c4c-80d4-9188d29886a6\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8657c58775-xld44" Apr 16 17:04:18.329887 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:04:18.329880 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/a7409492-d7b6-4c4c-80d4-9188d29886a6-tokenizer-tmp\") pod \"router-with-refs-test-kserve-router-scheduler-8657c58775-xld44\" (UID: \"a7409492-d7b6-4c4c-80d4-9188d29886a6\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8657c58775-xld44" Apr 16 17:04:18.330042 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:04:18.329924 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/a7409492-d7b6-4c4c-80d4-9188d29886a6-tokenizer-cache\") pod \"router-with-refs-test-kserve-router-scheduler-8657c58775-xld44\" (UID: \"a7409492-d7b6-4c4c-80d4-9188d29886a6\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8657c58775-xld44" Apr 16 17:04:18.430663 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:04:18.430635 2573 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/a7409492-d7b6-4c4c-80d4-9188d29886a6-tokenizer-uds\") pod \"router-with-refs-test-kserve-router-scheduler-8657c58775-xld44\" (UID: \"a7409492-d7b6-4c4c-80d4-9188d29886a6\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8657c58775-xld44" Apr 16 17:04:18.430833 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:04:18.430670 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/a7409492-d7b6-4c4c-80d4-9188d29886a6-tokenizer-tmp\") pod \"router-with-refs-test-kserve-router-scheduler-8657c58775-xld44\" (UID: \"a7409492-d7b6-4c4c-80d4-9188d29886a6\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8657c58775-xld44" Apr 16 17:04:18.430833 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:04:18.430699 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/a7409492-d7b6-4c4c-80d4-9188d29886a6-tokenizer-cache\") pod \"router-with-refs-test-kserve-router-scheduler-8657c58775-xld44\" (UID: \"a7409492-d7b6-4c4c-80d4-9188d29886a6\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8657c58775-xld44" Apr 16 17:04:18.430833 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:04:18.430739 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jv5r2\" (UniqueName: \"kubernetes.io/projected/a7409492-d7b6-4c4c-80d4-9188d29886a6-kube-api-access-jv5r2\") pod \"router-with-refs-test-kserve-router-scheduler-8657c58775-xld44\" (UID: \"a7409492-d7b6-4c4c-80d4-9188d29886a6\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8657c58775-xld44" Apr 16 17:04:18.430833 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:04:18.430757 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/a7409492-d7b6-4c4c-80d4-9188d29886a6-tls-certs\") pod \"router-with-refs-test-kserve-router-scheduler-8657c58775-xld44\" (UID: \"a7409492-d7b6-4c4c-80d4-9188d29886a6\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8657c58775-xld44" Apr 16 17:04:18.430833 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:04:18.430778 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a7409492-d7b6-4c4c-80d4-9188d29886a6-kserve-provision-location\") pod \"router-with-refs-test-kserve-router-scheduler-8657c58775-xld44\" (UID: \"a7409492-d7b6-4c4c-80d4-9188d29886a6\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8657c58775-xld44" Apr 16 17:04:18.431088 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:04:18.431041 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/a7409492-d7b6-4c4c-80d4-9188d29886a6-tokenizer-tmp\") pod \"router-with-refs-test-kserve-router-scheduler-8657c58775-xld44\" (UID: \"a7409492-d7b6-4c4c-80d4-9188d29886a6\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8657c58775-xld44" Apr 16 17:04:18.431140 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:04:18.431096 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/a7409492-d7b6-4c4c-80d4-9188d29886a6-tokenizer-uds\") pod \"router-with-refs-test-kserve-router-scheduler-8657c58775-xld44\" (UID: \"a7409492-d7b6-4c4c-80d4-9188d29886a6\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8657c58775-xld44" Apr 16 17:04:18.431178 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:04:18.431145 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/a7409492-d7b6-4c4c-80d4-9188d29886a6-kserve-provision-location\") pod \"router-with-refs-test-kserve-router-scheduler-8657c58775-xld44\" (UID: \"a7409492-d7b6-4c4c-80d4-9188d29886a6\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8657c58775-xld44" Apr 16 17:04:18.431217 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:04:18.431205 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/a7409492-d7b6-4c4c-80d4-9188d29886a6-tokenizer-cache\") pod \"router-with-refs-test-kserve-router-scheduler-8657c58775-xld44\" (UID: \"a7409492-d7b6-4c4c-80d4-9188d29886a6\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8657c58775-xld44" Apr 16 17:04:18.433327 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:04:18.433299 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a7409492-d7b6-4c4c-80d4-9188d29886a6-tls-certs\") pod \"router-with-refs-test-kserve-router-scheduler-8657c58775-xld44\" (UID: \"a7409492-d7b6-4c4c-80d4-9188d29886a6\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8657c58775-xld44" Apr 16 17:04:18.439058 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:04:18.439037 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jv5r2\" (UniqueName: \"kubernetes.io/projected/a7409492-d7b6-4c4c-80d4-9188d29886a6-kube-api-access-jv5r2\") pod \"router-with-refs-test-kserve-router-scheduler-8657c58775-xld44\" (UID: \"a7409492-d7b6-4c4c-80d4-9188d29886a6\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8657c58775-xld44" Apr 16 17:04:18.541684 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:04:18.541663 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8657c58775-xld44" Apr 16 17:04:18.665405 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:04:18.665360 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8657c58775-xld44"] Apr 16 17:04:18.667614 ip-10-0-138-58 kubenswrapper[2573]: W0416 17:04:18.667589 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7409492_d7b6_4c4c_80d4_9188d29886a6.slice/crio-4818e628b4a07278d9dcf0bf20ab17e57324a78ea205705a8fbfb62fecf5f394 WatchSource:0}: Error finding container 4818e628b4a07278d9dcf0bf20ab17e57324a78ea205705a8fbfb62fecf5f394: Status 404 returned error can't find the container with id 4818e628b4a07278d9dcf0bf20ab17e57324a78ea205705a8fbfb62fecf5f394 Apr 16 17:04:18.669565 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:04:18.669550 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 17:04:18.818292 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:04:18.818196 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8657c58775-xld44" event={"ID":"a7409492-d7b6-4c4c-80d4-9188d29886a6","Type":"ContainerStarted","Data":"34be1d7dc1543f3f23b1b7a4056c1745c6f63a09a7d208ed9c47524ef6d2a917"} Apr 16 17:04:18.818292 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:04:18.818232 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8657c58775-xld44" event={"ID":"a7409492-d7b6-4c4c-80d4-9188d29886a6","Type":"ContainerStarted","Data":"4818e628b4a07278d9dcf0bf20ab17e57324a78ea205705a8fbfb62fecf5f394"} Apr 16 17:04:19.822558 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:04:19.822525 2573 generic.go:358] "Generic (PLEG): container finished" 
podID="a7409492-d7b6-4c4c-80d4-9188d29886a6" containerID="34be1d7dc1543f3f23b1b7a4056c1745c6f63a09a7d208ed9c47524ef6d2a917" exitCode=0 Apr 16 17:04:19.822918 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:04:19.822569 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8657c58775-xld44" event={"ID":"a7409492-d7b6-4c4c-80d4-9188d29886a6","Type":"ContainerDied","Data":"34be1d7dc1543f3f23b1b7a4056c1745c6f63a09a7d208ed9c47524ef6d2a917"} Apr 16 17:04:20.829610 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:04:20.829577 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8657c58775-xld44" event={"ID":"a7409492-d7b6-4c4c-80d4-9188d29886a6","Type":"ContainerStarted","Data":"684ce83625a852830c42deb22b0a0faef66d9252dd3f518316a291df1e70685f"} Apr 16 17:04:20.829610 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:04:20.829610 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8657c58775-xld44" event={"ID":"a7409492-d7b6-4c4c-80d4-9188d29886a6","Type":"ContainerStarted","Data":"614dac5bd0f4937ee74f4173a17a98416f27e42c2e74648ddab127d5a80f8b85"} Apr 16 17:04:20.830035 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:04:20.829697 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8657c58775-xld44" Apr 16 17:04:20.849838 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:04:20.849788 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8657c58775-xld44" podStartSLOduration=2.849774229 podStartE2EDuration="2.849774229s" podCreationTimestamp="2026-04-16 17:04:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-04-16 17:04:20.847819823 +0000 UTC m=+985.988538580" watchObservedRunningTime="2026-04-16 17:04:20.849774229 +0000 UTC m=+985.990492970" Apr 16 17:04:23.765932 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:04:23.765901 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7db4d9c6b5-jdgbg" Apr 16 17:04:28.542693 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:04:28.542658 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8657c58775-xld44" Apr 16 17:04:28.543131 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:04:28.542798 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8657c58775-xld44" Apr 16 17:04:28.545237 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:04:28.545214 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8657c58775-xld44" Apr 16 17:04:28.860130 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:04:28.860042 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8657c58775-xld44" Apr 16 17:04:50.866316 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:04:50.866284 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8657c58775-xld44" Apr 16 17:05:43.183102 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:05:43.183028 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7db4d9c6b5-jdgbg"] Apr 16 17:05:43.183547 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:05:43.183396 2573 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7db4d9c6b5-jdgbg" podUID="50631686-fdca-4bcb-90ad-e0445b733f21" containerName="main" containerID="cri-o://5c83c4f76f679b5dc339f6d01a5e8dfea05a5bd6fe15404299937c99bddd49d9" gracePeriod=30 Apr 16 17:05:43.183547 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:05:43.183420 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7db4d9c6b5-jdgbg" podUID="50631686-fdca-4bcb-90ad-e0445b733f21" containerName="tokenizer" containerID="cri-o://776acc62656e1c7618074088a633f6e620b21f0a733eb9cd797280817b2753c4" gracePeriod=30 Apr 16 17:05:43.764663 ip-10-0-138-58 kubenswrapper[2573]: W0416 17:05:43.764624 2573 logging.go:55] [core] [Channel #387 SubChannel #388]grpc: addrConn.createTransport failed to connect to {Addr: "10.132.0.36:9003", ServerName: "10.132.0.36:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.132.0.36:9003: connect: connection refused" Apr 16 17:05:44.107297 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:05:44.107270 2573 generic.go:358] "Generic (PLEG): container finished" podID="50631686-fdca-4bcb-90ad-e0445b733f21" containerID="5c83c4f76f679b5dc339f6d01a5e8dfea05a5bd6fe15404299937c99bddd49d9" exitCode=0 Apr 16 17:05:44.107482 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:05:44.107334 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7db4d9c6b5-jdgbg" event={"ID":"50631686-fdca-4bcb-90ad-e0445b733f21","Type":"ContainerDied","Data":"5c83c4f76f679b5dc339f6d01a5e8dfea05a5bd6fe15404299937c99bddd49d9"} Apr 16 17:05:44.245391 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:05:44.245357 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7db4d9c6b5-jdgbg" Apr 16 17:05:44.401951 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:05:44.401886 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/50631686-fdca-4bcb-90ad-e0445b733f21-kserve-provision-location\") pod \"50631686-fdca-4bcb-90ad-e0445b733f21\" (UID: \"50631686-fdca-4bcb-90ad-e0445b733f21\") " Apr 16 17:05:44.401951 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:05:44.401915 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/50631686-fdca-4bcb-90ad-e0445b733f21-tls-certs\") pod \"50631686-fdca-4bcb-90ad-e0445b733f21\" (UID: \"50631686-fdca-4bcb-90ad-e0445b733f21\") " Apr 16 17:05:44.401951 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:05:44.401935 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/50631686-fdca-4bcb-90ad-e0445b733f21-tokenizer-cache\") pod \"50631686-fdca-4bcb-90ad-e0445b733f21\" (UID: \"50631686-fdca-4bcb-90ad-e0445b733f21\") " Apr 16 17:05:44.402122 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:05:44.402074 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/50631686-fdca-4bcb-90ad-e0445b733f21-tokenizer-tmp\") pod \"50631686-fdca-4bcb-90ad-e0445b733f21\" (UID: \"50631686-fdca-4bcb-90ad-e0445b733f21\") " Apr 16 17:05:44.402122 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:05:44.402111 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/50631686-fdca-4bcb-90ad-e0445b733f21-tokenizer-uds\") pod \"50631686-fdca-4bcb-90ad-e0445b733f21\" (UID: \"50631686-fdca-4bcb-90ad-e0445b733f21\") " Apr 16 17:05:44.402224 
ip-10-0-138-58 kubenswrapper[2573]: I0416 17:05:44.402130 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9lzdj\" (UniqueName: \"kubernetes.io/projected/50631686-fdca-4bcb-90ad-e0445b733f21-kube-api-access-9lzdj\") pod \"50631686-fdca-4bcb-90ad-e0445b733f21\" (UID: \"50631686-fdca-4bcb-90ad-e0445b733f21\") " Apr 16 17:05:44.402224 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:05:44.402186 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50631686-fdca-4bcb-90ad-e0445b733f21-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "50631686-fdca-4bcb-90ad-e0445b733f21" (UID: "50631686-fdca-4bcb-90ad-e0445b733f21"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:05:44.402435 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:05:44.402407 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50631686-fdca-4bcb-90ad-e0445b733f21-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "50631686-fdca-4bcb-90ad-e0445b733f21" (UID: "50631686-fdca-4bcb-90ad-e0445b733f21"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:05:44.402528 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:05:44.402448 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50631686-fdca-4bcb-90ad-e0445b733f21-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "50631686-fdca-4bcb-90ad-e0445b733f21" (UID: "50631686-fdca-4bcb-90ad-e0445b733f21"). InnerVolumeSpecName "tokenizer-tmp". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:05:44.402528 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:05:44.402475 2573 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/50631686-fdca-4bcb-90ad-e0445b733f21-tokenizer-cache\") on node \"ip-10-0-138-58.ec2.internal\" DevicePath \"\"" Apr 16 17:05:44.402686 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:05:44.402668 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50631686-fdca-4bcb-90ad-e0445b733f21-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "50631686-fdca-4bcb-90ad-e0445b733f21" (UID: "50631686-fdca-4bcb-90ad-e0445b733f21"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:05:44.404012 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:05:44.403988 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50631686-fdca-4bcb-90ad-e0445b733f21-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "50631686-fdca-4bcb-90ad-e0445b733f21" (UID: "50631686-fdca-4bcb-90ad-e0445b733f21"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 17:05:44.404345 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:05:44.404323 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50631686-fdca-4bcb-90ad-e0445b733f21-kube-api-access-9lzdj" (OuterVolumeSpecName: "kube-api-access-9lzdj") pod "50631686-fdca-4bcb-90ad-e0445b733f21" (UID: "50631686-fdca-4bcb-90ad-e0445b733f21"). InnerVolumeSpecName "kube-api-access-9lzdj". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 17:05:44.502991 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:05:44.502961 2573 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/50631686-fdca-4bcb-90ad-e0445b733f21-tokenizer-uds\") on node \"ip-10-0-138-58.ec2.internal\" DevicePath \"\"" Apr 16 17:05:44.502991 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:05:44.502986 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9lzdj\" (UniqueName: \"kubernetes.io/projected/50631686-fdca-4bcb-90ad-e0445b733f21-kube-api-access-9lzdj\") on node \"ip-10-0-138-58.ec2.internal\" DevicePath \"\"" Apr 16 17:05:44.502991 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:05:44.502997 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/50631686-fdca-4bcb-90ad-e0445b733f21-kserve-provision-location\") on node \"ip-10-0-138-58.ec2.internal\" DevicePath \"\"" Apr 16 17:05:44.503154 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:05:44.503006 2573 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/50631686-fdca-4bcb-90ad-e0445b733f21-tls-certs\") on node \"ip-10-0-138-58.ec2.internal\" DevicePath \"\"" Apr 16 17:05:44.503154 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:05:44.503017 2573 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/50631686-fdca-4bcb-90ad-e0445b733f21-tokenizer-tmp\") on node \"ip-10-0-138-58.ec2.internal\" DevicePath \"\"" Apr 16 17:05:44.764968 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:05:44.764903 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7db4d9c6b5-jdgbg" podUID="50631686-fdca-4bcb-90ad-e0445b733f21" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.132.0.36:9003\" 
within 1s: context deadline exceeded" Apr 16 17:05:45.112482 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:05:45.112443 2573 generic.go:358] "Generic (PLEG): container finished" podID="50631686-fdca-4bcb-90ad-e0445b733f21" containerID="776acc62656e1c7618074088a633f6e620b21f0a733eb9cd797280817b2753c4" exitCode=0 Apr 16 17:05:45.112636 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:05:45.112476 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7db4d9c6b5-jdgbg" event={"ID":"50631686-fdca-4bcb-90ad-e0445b733f21","Type":"ContainerDied","Data":"776acc62656e1c7618074088a633f6e620b21f0a733eb9cd797280817b2753c4"} Apr 16 17:05:45.112636 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:05:45.112526 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7db4d9c6b5-jdgbg" event={"ID":"50631686-fdca-4bcb-90ad-e0445b733f21","Type":"ContainerDied","Data":"f2a0a2064b9cdf8d4ecfdbbbfd70b77606b20f7141246eae06fd721e9c9c2521"} Apr 16 17:05:45.112636 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:05:45.112545 2573 scope.go:117] "RemoveContainer" containerID="776acc62656e1c7618074088a633f6e620b21f0a733eb9cd797280817b2753c4" Apr 16 17:05:45.112636 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:05:45.112561 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7db4d9c6b5-jdgbg" Apr 16 17:05:45.121808 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:05:45.121787 2573 scope.go:117] "RemoveContainer" containerID="5c83c4f76f679b5dc339f6d01a5e8dfea05a5bd6fe15404299937c99bddd49d9" Apr 16 17:05:45.129513 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:05:45.129494 2573 scope.go:117] "RemoveContainer" containerID="94fa8ff59a8041b1ddd7bf31b2cdc982806e3477f4ca8fe82945f14365a87e98" Apr 16 17:05:45.134503 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:05:45.134479 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7db4d9c6b5-jdgbg"] Apr 16 17:05:45.138467 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:05:45.138433 2573 scope.go:117] "RemoveContainer" containerID="776acc62656e1c7618074088a633f6e620b21f0a733eb9cd797280817b2753c4" Apr 16 17:05:45.138770 ip-10-0-138-58 kubenswrapper[2573]: E0416 17:05:45.138750 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"776acc62656e1c7618074088a633f6e620b21f0a733eb9cd797280817b2753c4\": container with ID starting with 776acc62656e1c7618074088a633f6e620b21f0a733eb9cd797280817b2753c4 not found: ID does not exist" containerID="776acc62656e1c7618074088a633f6e620b21f0a733eb9cd797280817b2753c4" Apr 16 17:05:45.138855 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:05:45.138780 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"776acc62656e1c7618074088a633f6e620b21f0a733eb9cd797280817b2753c4"} err="failed to get container status \"776acc62656e1c7618074088a633f6e620b21f0a733eb9cd797280817b2753c4\": rpc error: code = NotFound desc = could not find container \"776acc62656e1c7618074088a633f6e620b21f0a733eb9cd797280817b2753c4\": container with ID starting with 776acc62656e1c7618074088a633f6e620b21f0a733eb9cd797280817b2753c4 not found: ID 
does not exist" Apr 16 17:05:45.138855 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:05:45.138805 2573 scope.go:117] "RemoveContainer" containerID="5c83c4f76f679b5dc339f6d01a5e8dfea05a5bd6fe15404299937c99bddd49d9" Apr 16 17:05:45.138855 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:05:45.138830 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7db4d9c6b5-jdgbg"] Apr 16 17:05:45.139127 ip-10-0-138-58 kubenswrapper[2573]: E0416 17:05:45.139109 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c83c4f76f679b5dc339f6d01a5e8dfea05a5bd6fe15404299937c99bddd49d9\": container with ID starting with 5c83c4f76f679b5dc339f6d01a5e8dfea05a5bd6fe15404299937c99bddd49d9 not found: ID does not exist" containerID="5c83c4f76f679b5dc339f6d01a5e8dfea05a5bd6fe15404299937c99bddd49d9" Apr 16 17:05:45.139174 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:05:45.139133 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c83c4f76f679b5dc339f6d01a5e8dfea05a5bd6fe15404299937c99bddd49d9"} err="failed to get container status \"5c83c4f76f679b5dc339f6d01a5e8dfea05a5bd6fe15404299937c99bddd49d9\": rpc error: code = NotFound desc = could not find container \"5c83c4f76f679b5dc339f6d01a5e8dfea05a5bd6fe15404299937c99bddd49d9\": container with ID starting with 5c83c4f76f679b5dc339f6d01a5e8dfea05a5bd6fe15404299937c99bddd49d9 not found: ID does not exist" Apr 16 17:05:45.139174 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:05:45.139151 2573 scope.go:117] "RemoveContainer" containerID="94fa8ff59a8041b1ddd7bf31b2cdc982806e3477f4ca8fe82945f14365a87e98" Apr 16 17:05:45.139544 ip-10-0-138-58 kubenswrapper[2573]: E0416 17:05:45.139523 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"94fa8ff59a8041b1ddd7bf31b2cdc982806e3477f4ca8fe82945f14365a87e98\": container with ID starting with 94fa8ff59a8041b1ddd7bf31b2cdc982806e3477f4ca8fe82945f14365a87e98 not found: ID does not exist" containerID="94fa8ff59a8041b1ddd7bf31b2cdc982806e3477f4ca8fe82945f14365a87e98" Apr 16 17:05:45.139617 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:05:45.139553 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94fa8ff59a8041b1ddd7bf31b2cdc982806e3477f4ca8fe82945f14365a87e98"} err="failed to get container status \"94fa8ff59a8041b1ddd7bf31b2cdc982806e3477f4ca8fe82945f14365a87e98\": rpc error: code = NotFound desc = could not find container \"94fa8ff59a8041b1ddd7bf31b2cdc982806e3477f4ca8fe82945f14365a87e98\": container with ID starting with 94fa8ff59a8041b1ddd7bf31b2cdc982806e3477f4ca8fe82945f14365a87e98 not found: ID does not exist" Apr 16 17:05:45.420229 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:05:45.420173 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50631686-fdca-4bcb-90ad-e0445b733f21" path="/var/lib/kubelet/pods/50631686-fdca-4bcb-90ad-e0445b733f21/volumes" Apr 16 17:06:14.488955 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:06:14.488923 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8657c58775-xld44"] Apr 16 17:06:14.489568 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:06:14.489234 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8657c58775-xld44" podUID="a7409492-d7b6-4c4c-80d4-9188d29886a6" containerName="main" containerID="cri-o://614dac5bd0f4937ee74f4173a17a98416f27e42c2e74648ddab127d5a80f8b85" gracePeriod=30 Apr 16 17:06:14.489568 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:06:14.489280 2573 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8657c58775-xld44" podUID="a7409492-d7b6-4c4c-80d4-9188d29886a6" containerName="tokenizer" containerID="cri-o://684ce83625a852830c42deb22b0a0faef66d9252dd3f518316a291df1e70685f" gracePeriod=30 Apr 16 17:06:15.216882 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:06:15.216852 2573 generic.go:358] "Generic (PLEG): container finished" podID="a7409492-d7b6-4c4c-80d4-9188d29886a6" containerID="614dac5bd0f4937ee74f4173a17a98416f27e42c2e74648ddab127d5a80f8b85" exitCode=0 Apr 16 17:06:15.217057 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:06:15.216906 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8657c58775-xld44" event={"ID":"a7409492-d7b6-4c4c-80d4-9188d29886a6","Type":"ContainerDied","Data":"614dac5bd0f4937ee74f4173a17a98416f27e42c2e74648ddab127d5a80f8b85"} Apr 16 17:06:15.633814 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:06:15.633794 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8657c58775-xld44" Apr 16 17:06:15.728053 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:06:15.728034 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/a7409492-d7b6-4c4c-80d4-9188d29886a6-tokenizer-cache\") pod \"a7409492-d7b6-4c4c-80d4-9188d29886a6\" (UID: \"a7409492-d7b6-4c4c-80d4-9188d29886a6\") " Apr 16 17:06:15.728180 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:06:15.728067 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jv5r2\" (UniqueName: \"kubernetes.io/projected/a7409492-d7b6-4c4c-80d4-9188d29886a6-kube-api-access-jv5r2\") pod \"a7409492-d7b6-4c4c-80d4-9188d29886a6\" (UID: \"a7409492-d7b6-4c4c-80d4-9188d29886a6\") " Apr 16 17:06:15.728180 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:06:15.728084 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/a7409492-d7b6-4c4c-80d4-9188d29886a6-tokenizer-tmp\") pod \"a7409492-d7b6-4c4c-80d4-9188d29886a6\" (UID: \"a7409492-d7b6-4c4c-80d4-9188d29886a6\") " Apr 16 17:06:15.728180 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:06:15.728126 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a7409492-d7b6-4c4c-80d4-9188d29886a6-tls-certs\") pod \"a7409492-d7b6-4c4c-80d4-9188d29886a6\" (UID: \"a7409492-d7b6-4c4c-80d4-9188d29886a6\") " Apr 16 17:06:15.728180 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:06:15.728165 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a7409492-d7b6-4c4c-80d4-9188d29886a6-kserve-provision-location\") pod \"a7409492-d7b6-4c4c-80d4-9188d29886a6\" (UID: \"a7409492-d7b6-4c4c-80d4-9188d29886a6\") " Apr 
16 17:06:15.728412 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:06:15.728193 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/a7409492-d7b6-4c4c-80d4-9188d29886a6-tokenizer-uds\") pod \"a7409492-d7b6-4c4c-80d4-9188d29886a6\" (UID: \"a7409492-d7b6-4c4c-80d4-9188d29886a6\") " Apr 16 17:06:15.728412 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:06:15.728329 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7409492-d7b6-4c4c-80d4-9188d29886a6-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "a7409492-d7b6-4c4c-80d4-9188d29886a6" (UID: "a7409492-d7b6-4c4c-80d4-9188d29886a6"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:06:15.728536 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:06:15.728506 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7409492-d7b6-4c4c-80d4-9188d29886a6-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "a7409492-d7b6-4c4c-80d4-9188d29886a6" (UID: "a7409492-d7b6-4c4c-80d4-9188d29886a6"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:06:15.728581 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:06:15.728519 2573 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/a7409492-d7b6-4c4c-80d4-9188d29886a6-tokenizer-cache\") on node \"ip-10-0-138-58.ec2.internal\" DevicePath \"\"" Apr 16 17:06:15.728581 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:06:15.728526 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7409492-d7b6-4c4c-80d4-9188d29886a6-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "a7409492-d7b6-4c4c-80d4-9188d29886a6" (UID: "a7409492-d7b6-4c4c-80d4-9188d29886a6"). InnerVolumeSpecName "tokenizer-uds". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:06:15.728888 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:06:15.728869 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7409492-d7b6-4c4c-80d4-9188d29886a6-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a7409492-d7b6-4c4c-80d4-9188d29886a6" (UID: "a7409492-d7b6-4c4c-80d4-9188d29886a6"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:06:15.730146 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:06:15.730120 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7409492-d7b6-4c4c-80d4-9188d29886a6-kube-api-access-jv5r2" (OuterVolumeSpecName: "kube-api-access-jv5r2") pod "a7409492-d7b6-4c4c-80d4-9188d29886a6" (UID: "a7409492-d7b6-4c4c-80d4-9188d29886a6"). InnerVolumeSpecName "kube-api-access-jv5r2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 17:06:15.730239 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:06:15.730173 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7409492-d7b6-4c4c-80d4-9188d29886a6-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "a7409492-d7b6-4c4c-80d4-9188d29886a6" (UID: "a7409492-d7b6-4c4c-80d4-9188d29886a6"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 17:06:15.829075 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:06:15.829052 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jv5r2\" (UniqueName: \"kubernetes.io/projected/a7409492-d7b6-4c4c-80d4-9188d29886a6-kube-api-access-jv5r2\") on node \"ip-10-0-138-58.ec2.internal\" DevicePath \"\""
Apr 16 17:06:15.829075 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:06:15.829074 2573 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/a7409492-d7b6-4c4c-80d4-9188d29886a6-tokenizer-tmp\") on node \"ip-10-0-138-58.ec2.internal\" DevicePath \"\""
Apr 16 17:06:15.829205 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:06:15.829084 2573 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a7409492-d7b6-4c4c-80d4-9188d29886a6-tls-certs\") on node \"ip-10-0-138-58.ec2.internal\" DevicePath \"\""
Apr 16 17:06:15.829205 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:06:15.829092 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a7409492-d7b6-4c4c-80d4-9188d29886a6-kserve-provision-location\") on node \"ip-10-0-138-58.ec2.internal\" DevicePath \"\""
Apr 16 17:06:15.829205 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:06:15.829101 2573 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/a7409492-d7b6-4c4c-80d4-9188d29886a6-tokenizer-uds\") on node \"ip-10-0-138-58.ec2.internal\" DevicePath \"\""
Apr 16 17:06:16.222622 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:06:16.222548 2573 generic.go:358] "Generic (PLEG): container finished" podID="a7409492-d7b6-4c4c-80d4-9188d29886a6" containerID="684ce83625a852830c42deb22b0a0faef66d9252dd3f518316a291df1e70685f" exitCode=0
Apr 16 17:06:16.222750 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:06:16.222618 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8657c58775-xld44" event={"ID":"a7409492-d7b6-4c4c-80d4-9188d29886a6","Type":"ContainerDied","Data":"684ce83625a852830c42deb22b0a0faef66d9252dd3f518316a291df1e70685f"}
Apr 16 17:06:16.222750 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:06:16.222659 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8657c58775-xld44" event={"ID":"a7409492-d7b6-4c4c-80d4-9188d29886a6","Type":"ContainerDied","Data":"4818e628b4a07278d9dcf0bf20ab17e57324a78ea205705a8fbfb62fecf5f394"}
Apr 16 17:06:16.222750 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:06:16.222659 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8657c58775-xld44"
Apr 16 17:06:16.222750 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:06:16.222674 2573 scope.go:117] "RemoveContainer" containerID="684ce83625a852830c42deb22b0a0faef66d9252dd3f518316a291df1e70685f"
Apr 16 17:06:16.232329 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:06:16.232306 2573 scope.go:117] "RemoveContainer" containerID="614dac5bd0f4937ee74f4173a17a98416f27e42c2e74648ddab127d5a80f8b85"
Apr 16 17:06:16.239961 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:06:16.239938 2573 scope.go:117] "RemoveContainer" containerID="34be1d7dc1543f3f23b1b7a4056c1745c6f63a09a7d208ed9c47524ef6d2a917"
Apr 16 17:06:16.247130 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:06:16.247112 2573 scope.go:117] "RemoveContainer" containerID="684ce83625a852830c42deb22b0a0faef66d9252dd3f518316a291df1e70685f"
Apr 16 17:06:16.247454 ip-10-0-138-58 kubenswrapper[2573]: E0416 17:06:16.247431 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"684ce83625a852830c42deb22b0a0faef66d9252dd3f518316a291df1e70685f\": container with ID starting with 684ce83625a852830c42deb22b0a0faef66d9252dd3f518316a291df1e70685f not found: ID does not exist" containerID="684ce83625a852830c42deb22b0a0faef66d9252dd3f518316a291df1e70685f"
Apr 16 17:06:16.247540 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:06:16.247463 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"684ce83625a852830c42deb22b0a0faef66d9252dd3f518316a291df1e70685f"} err="failed to get container status \"684ce83625a852830c42deb22b0a0faef66d9252dd3f518316a291df1e70685f\": rpc error: code = NotFound desc = could not find container \"684ce83625a852830c42deb22b0a0faef66d9252dd3f518316a291df1e70685f\": container with ID starting with 684ce83625a852830c42deb22b0a0faef66d9252dd3f518316a291df1e70685f not found: ID does not exist"
Apr 16 17:06:16.247540 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:06:16.247484 2573 scope.go:117] "RemoveContainer" containerID="614dac5bd0f4937ee74f4173a17a98416f27e42c2e74648ddab127d5a80f8b85"
Apr 16 17:06:16.247762 ip-10-0-138-58 kubenswrapper[2573]: E0416 17:06:16.247738 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"614dac5bd0f4937ee74f4173a17a98416f27e42c2e74648ddab127d5a80f8b85\": container with ID starting with 614dac5bd0f4937ee74f4173a17a98416f27e42c2e74648ddab127d5a80f8b85 not found: ID does not exist" containerID="614dac5bd0f4937ee74f4173a17a98416f27e42c2e74648ddab127d5a80f8b85"
Apr 16 17:06:16.247809 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:06:16.247767 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"614dac5bd0f4937ee74f4173a17a98416f27e42c2e74648ddab127d5a80f8b85"} err="failed to get container status \"614dac5bd0f4937ee74f4173a17a98416f27e42c2e74648ddab127d5a80f8b85\": rpc error: code = NotFound desc = could not find container \"614dac5bd0f4937ee74f4173a17a98416f27e42c2e74648ddab127d5a80f8b85\": container with ID starting with 614dac5bd0f4937ee74f4173a17a98416f27e42c2e74648ddab127d5a80f8b85 not found: ID does not exist"
Apr 16 17:06:16.247809 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:06:16.247783 2573 scope.go:117] "RemoveContainer" containerID="34be1d7dc1543f3f23b1b7a4056c1745c6f63a09a7d208ed9c47524ef6d2a917"
Apr 16 17:06:16.248050 ip-10-0-138-58 kubenswrapper[2573]: E0416 17:06:16.248031 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34be1d7dc1543f3f23b1b7a4056c1745c6f63a09a7d208ed9c47524ef6d2a917\": container with ID starting with 34be1d7dc1543f3f23b1b7a4056c1745c6f63a09a7d208ed9c47524ef6d2a917 not found: ID does not exist" containerID="34be1d7dc1543f3f23b1b7a4056c1745c6f63a09a7d208ed9c47524ef6d2a917"
Apr 16 17:06:16.248094 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:06:16.248058 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34be1d7dc1543f3f23b1b7a4056c1745c6f63a09a7d208ed9c47524ef6d2a917"} err="failed to get container status \"34be1d7dc1543f3f23b1b7a4056c1745c6f63a09a7d208ed9c47524ef6d2a917\": rpc error: code = NotFound desc = could not find container \"34be1d7dc1543f3f23b1b7a4056c1745c6f63a09a7d208ed9c47524ef6d2a917\": container with ID starting with 34be1d7dc1543f3f23b1b7a4056c1745c6f63a09a7d208ed9c47524ef6d2a917 not found: ID does not exist"
Apr 16 17:06:16.261869 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:06:16.261848 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8657c58775-xld44"]
Apr 16 17:06:16.265854 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:06:16.265833 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8657c58775-xld44"]
Apr 16 17:06:17.421713 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:06:17.421676 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7409492-d7b6-4c4c-80d4-9188d29886a6" path="/var/lib/kubelet/pods/a7409492-d7b6-4c4c-80d4-9188d29886a6/volumes"
Apr 16 17:06:47.955128 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:06:47.955043 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schew2pwj"]
Apr 16 17:06:47.955822 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:06:47.955532 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a7409492-d7b6-4c4c-80d4-9188d29886a6" containerName="main"
Apr 16 17:06:47.955822 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:06:47.955554 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7409492-d7b6-4c4c-80d4-9188d29886a6" containerName="main"
Apr 16 17:06:47.955822 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:06:47.955574 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="50631686-fdca-4bcb-90ad-e0445b733f21" containerName="main"
Apr 16 17:06:47.955822 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:06:47.955583 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="50631686-fdca-4bcb-90ad-e0445b733f21" containerName="main"
Apr 16 17:06:47.955822 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:06:47.955596 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="50631686-fdca-4bcb-90ad-e0445b733f21" containerName="storage-initializer"
Apr 16 17:06:47.955822 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:06:47.955606 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="50631686-fdca-4bcb-90ad-e0445b733f21" containerName="storage-initializer"
Apr 16 17:06:47.955822 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:06:47.955631 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a7409492-d7b6-4c4c-80d4-9188d29886a6" containerName="tokenizer"
Apr 16 17:06:47.955822 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:06:47.955639 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7409492-d7b6-4c4c-80d4-9188d29886a6" containerName="tokenizer"
Apr 16 17:06:47.955822 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:06:47.955660 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a7409492-d7b6-4c4c-80d4-9188d29886a6" containerName="storage-initializer"
Apr 16 17:06:47.955822 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:06:47.955669 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7409492-d7b6-4c4c-80d4-9188d29886a6" containerName="storage-initializer"
Apr 16 17:06:47.955822 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:06:47.955692 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="50631686-fdca-4bcb-90ad-e0445b733f21" containerName="tokenizer"
Apr 16 17:06:47.955822 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:06:47.955700 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="50631686-fdca-4bcb-90ad-e0445b733f21" containerName="tokenizer"
Apr 16 17:06:47.955822 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:06:47.955792 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="50631686-fdca-4bcb-90ad-e0445b733f21" containerName="main"
Apr 16 17:06:47.955822 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:06:47.955807 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="a7409492-d7b6-4c4c-80d4-9188d29886a6" containerName="main"
Apr 16 17:06:47.955822 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:06:47.955819 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="50631686-fdca-4bcb-90ad-e0445b733f21" containerName="tokenizer"
Apr 16 17:06:47.955822 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:06:47.955830 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="a7409492-d7b6-4c4c-80d4-9188d29886a6" containerName="tokenizer"
Apr 16 17:06:47.959181 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:06:47.959153 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schew2pwj"
Apr 16 17:06:47.962928 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:06:47.962904 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv8f1a6f044e8c7a4d31a250e0c4861caf-kserve-self-signed-certs\""
Apr 16 17:06:47.962928 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:06:47.962922 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-g4psd\""
Apr 16 17:06:47.963096 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:06:47.962922 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-05aa9bba-epp-sa-dockercfg-j4cb7\""
Apr 16 17:06:47.969264 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:06:47.969232 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schew2pwj"]
Apr 16 17:06:48.043893 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:06:48.043867 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/e7d1c5fa-44a4-41c4-8146-6136151de172-tokenizer-tmp\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schew2pwj\" (UID: \"e7d1c5fa-44a4-41c4-8146-6136151de172\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schew2pwj"
Apr 16 17:06:48.044009 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:06:48.043896 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/e7d1c5fa-44a4-41c4-8146-6136151de172-tokenizer-cache\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schew2pwj\" (UID: \"e7d1c5fa-44a4-41c4-8146-6136151de172\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schew2pwj"
Apr 16 17:06:48.044009 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:06:48.043916 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cz4pb\" (UniqueName: \"kubernetes.io/projected/e7d1c5fa-44a4-41c4-8146-6136151de172-kube-api-access-cz4pb\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schew2pwj\" (UID: \"e7d1c5fa-44a4-41c4-8146-6136151de172\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schew2pwj"
Apr 16 17:06:48.044009 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:06:48.043971 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e7d1c5fa-44a4-41c4-8146-6136151de172-tls-certs\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schew2pwj\" (UID: \"e7d1c5fa-44a4-41c4-8146-6136151de172\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schew2pwj"
Apr 16 17:06:48.044009 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:06:48.043991 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/e7d1c5fa-44a4-41c4-8146-6136151de172-tokenizer-uds\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schew2pwj\" (UID: \"e7d1c5fa-44a4-41c4-8146-6136151de172\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schew2pwj"
Apr 16 17:06:48.044144 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:06:48.044064 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e7d1c5fa-44a4-41c4-8146-6136151de172-kserve-provision-location\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schew2pwj\" (UID: \"e7d1c5fa-44a4-41c4-8146-6136151de172\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schew2pwj"
Apr 16 17:06:48.144356 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:06:48.144335 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e7d1c5fa-44a4-41c4-8146-6136151de172-tls-certs\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schew2pwj\" (UID: \"e7d1c5fa-44a4-41c4-8146-6136151de172\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schew2pwj"
Apr 16 17:06:48.144471 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:06:48.144364 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/e7d1c5fa-44a4-41c4-8146-6136151de172-tokenizer-uds\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schew2pwj\" (UID: \"e7d1c5fa-44a4-41c4-8146-6136151de172\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schew2pwj"
Apr 16 17:06:48.144471 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:06:48.144425 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e7d1c5fa-44a4-41c4-8146-6136151de172-kserve-provision-location\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schew2pwj\" (UID: \"e7d1c5fa-44a4-41c4-8146-6136151de172\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schew2pwj"
Apr 16 17:06:48.144471 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:06:48.144463 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/e7d1c5fa-44a4-41c4-8146-6136151de172-tokenizer-tmp\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schew2pwj\" (UID: \"e7d1c5fa-44a4-41c4-8146-6136151de172\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schew2pwj"
Apr 16 17:06:48.144618 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:06:48.144480 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/e7d1c5fa-44a4-41c4-8146-6136151de172-tokenizer-cache\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schew2pwj\" (UID: \"e7d1c5fa-44a4-41c4-8146-6136151de172\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schew2pwj"
Apr 16 17:06:48.144618 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:06:48.144495 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cz4pb\" (UniqueName: \"kubernetes.io/projected/e7d1c5fa-44a4-41c4-8146-6136151de172-kube-api-access-cz4pb\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schew2pwj\" (UID: \"e7d1c5fa-44a4-41c4-8146-6136151de172\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schew2pwj"
Apr 16 17:06:48.144898 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:06:48.144874 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/e7d1c5fa-44a4-41c4-8146-6136151de172-tokenizer-uds\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schew2pwj\" (UID: \"e7d1c5fa-44a4-41c4-8146-6136151de172\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schew2pwj"
Apr 16 17:06:48.144978 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:06:48.144899 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/e7d1c5fa-44a4-41c4-8146-6136151de172-tokenizer-cache\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schew2pwj\" (UID: \"e7d1c5fa-44a4-41c4-8146-6136151de172\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schew2pwj"
Apr 16 17:06:48.144978 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:06:48.144957 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e7d1c5fa-44a4-41c4-8146-6136151de172-kserve-provision-location\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schew2pwj\" (UID: \"e7d1c5fa-44a4-41c4-8146-6136151de172\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schew2pwj"
Apr 16 17:06:48.145083 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:06:48.145066 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/e7d1c5fa-44a4-41c4-8146-6136151de172-tokenizer-tmp\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schew2pwj\" (UID: \"e7d1c5fa-44a4-41c4-8146-6136151de172\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schew2pwj"
Apr 16 17:06:48.146857 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:06:48.146838 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e7d1c5fa-44a4-41c4-8146-6136151de172-tls-certs\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schew2pwj\" (UID: \"e7d1c5fa-44a4-41c4-8146-6136151de172\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schew2pwj"
Apr 16 17:06:48.152815 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:06:48.152790 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cz4pb\" (UniqueName: \"kubernetes.io/projected/e7d1c5fa-44a4-41c4-8146-6136151de172-kube-api-access-cz4pb\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schew2pwj\" (UID: \"e7d1c5fa-44a4-41c4-8146-6136151de172\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schew2pwj"
Apr 16 17:06:48.270177 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:06:48.270109 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schew2pwj"
Apr 16 17:06:48.392156 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:06:48.392117 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schew2pwj"]
Apr 16 17:06:48.394921 ip-10-0-138-58 kubenswrapper[2573]: W0416 17:06:48.394889 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode7d1c5fa_44a4_41c4_8146_6136151de172.slice/crio-27d49d53536d96f98e3773b414ed125e7b413c240b1e8146d9c4ed33491de566 WatchSource:0}: Error finding container 27d49d53536d96f98e3773b414ed125e7b413c240b1e8146d9c4ed33491de566: Status 404 returned error can't find the container with id 27d49d53536d96f98e3773b414ed125e7b413c240b1e8146d9c4ed33491de566
Apr 16 17:06:49.327941 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:06:49.327905 2573 generic.go:358] "Generic (PLEG): container finished" podID="e7d1c5fa-44a4-41c4-8146-6136151de172" containerID="358fe60dc263dcd336b82aa49942b098ea53bbf123cbcf1c8372818516b7ab28" exitCode=0
Apr 16 17:06:49.328237 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:06:49.327997 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schew2pwj" event={"ID":"e7d1c5fa-44a4-41c4-8146-6136151de172","Type":"ContainerDied","Data":"358fe60dc263dcd336b82aa49942b098ea53bbf123cbcf1c8372818516b7ab28"}
Apr 16 17:06:49.328237 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:06:49.328039 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schew2pwj" event={"ID":"e7d1c5fa-44a4-41c4-8146-6136151de172","Type":"ContainerStarted","Data":"27d49d53536d96f98e3773b414ed125e7b413c240b1e8146d9c4ed33491de566"}
Apr 16 17:06:50.333640 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:06:50.333608 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schew2pwj" event={"ID":"e7d1c5fa-44a4-41c4-8146-6136151de172","Type":"ContainerStarted","Data":"77cfdd1605c1bb76e9894ae1fef17ff07be34804aa65be211f3ee905abbf7c10"}
Apr 16 17:06:50.333640 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:06:50.333641 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schew2pwj" event={"ID":"e7d1c5fa-44a4-41c4-8146-6136151de172","Type":"ContainerStarted","Data":"a480c3634f8e66a16fcb9b023fd4689e36133a551e05f213c028c7b10cd286b3"}
Apr 16 17:06:50.334106 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:06:50.333876 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schew2pwj"
Apr 16 17:06:50.355277 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:06:50.355231 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schew2pwj" podStartSLOduration=3.355221287 podStartE2EDuration="3.355221287s" podCreationTimestamp="2026-04-16 17:06:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 17:06:50.353479917 +0000 UTC m=+1135.494198660" watchObservedRunningTime="2026-04-16 17:06:50.355221287 +0000 UTC m=+1135.495940030"
Apr 16 17:06:58.270754 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:06:58.270710 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schew2pwj"
Apr 16 17:06:58.270754 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:06:58.270764 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schew2pwj"
Apr 16 17:06:58.273451 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:06:58.273429 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schew2pwj"
Apr 16 17:06:58.363495 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:06:58.363469 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schew2pwj"
Apr 16 17:07:19.367947 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:07:19.367917 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schew2pwj"
Apr 16 17:07:55.396995 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:07:55.396970 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-jgsfb_6c99cce0-b27a-481f-8825-9d205581b7d0/console-operator/1.log"
Apr 16 17:07:55.399517 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:07:55.399494 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-jgsfb_6c99cce0-b27a-481f-8825-9d205581b7d0/console-operator/1.log"
Apr 16 17:09:51.578843 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:09:51.578811 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schew2pwj"]
Apr 16 17:09:51.579341 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:09:51.579185 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schew2pwj" podUID="e7d1c5fa-44a4-41c4-8146-6136151de172" containerName="main" containerID="cri-o://a480c3634f8e66a16fcb9b023fd4689e36133a551e05f213c028c7b10cd286b3" gracePeriod=30
Apr 16 17:09:51.579341 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:09:51.579293 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schew2pwj" podUID="e7d1c5fa-44a4-41c4-8146-6136151de172" containerName="tokenizer" containerID="cri-o://77cfdd1605c1bb76e9894ae1fef17ff07be34804aa65be211f3ee905abbf7c10" gracePeriod=30
Apr 16 17:09:51.874768 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:09:51.874689 2573 generic.go:358] "Generic (PLEG): container finished" podID="e7d1c5fa-44a4-41c4-8146-6136151de172" containerID="a480c3634f8e66a16fcb9b023fd4689e36133a551e05f213c028c7b10cd286b3" exitCode=0
Apr 16 17:09:51.874768 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:09:51.874726 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schew2pwj" event={"ID":"e7d1c5fa-44a4-41c4-8146-6136151de172","Type":"ContainerDied","Data":"a480c3634f8e66a16fcb9b023fd4689e36133a551e05f213c028c7b10cd286b3"}
Apr 16 17:09:52.642625 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:09:52.642603 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schew2pwj"
Apr 16 17:09:52.751723 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:09:52.751655 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e7d1c5fa-44a4-41c4-8146-6136151de172-kserve-provision-location\") pod \"e7d1c5fa-44a4-41c4-8146-6136151de172\" (UID: \"e7d1c5fa-44a4-41c4-8146-6136151de172\") "
Apr 16 17:09:52.751723 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:09:52.751711 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e7d1c5fa-44a4-41c4-8146-6136151de172-tls-certs\") pod \"e7d1c5fa-44a4-41c4-8146-6136151de172\" (UID: \"e7d1c5fa-44a4-41c4-8146-6136151de172\") "
Apr 16 17:09:52.751928 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:09:52.751736 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/e7d1c5fa-44a4-41c4-8146-6136151de172-tokenizer-cache\") pod \"e7d1c5fa-44a4-41c4-8146-6136151de172\" (UID: \"e7d1c5fa-44a4-41c4-8146-6136151de172\") "
Apr 16 17:09:52.751928 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:09:52.751765 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/e7d1c5fa-44a4-41c4-8146-6136151de172-tokenizer-tmp\") pod \"e7d1c5fa-44a4-41c4-8146-6136151de172\" (UID: \"e7d1c5fa-44a4-41c4-8146-6136151de172\") "
Apr 16 17:09:52.751928 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:09:52.751870 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/e7d1c5fa-44a4-41c4-8146-6136151de172-tokenizer-uds\") pod \"e7d1c5fa-44a4-41c4-8146-6136151de172\" (UID: \"e7d1c5fa-44a4-41c4-8146-6136151de172\") "
Apr 16 17:09:52.751928 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:09:52.751909 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cz4pb\" (UniqueName: \"kubernetes.io/projected/e7d1c5fa-44a4-41c4-8146-6136151de172-kube-api-access-cz4pb\") pod \"e7d1c5fa-44a4-41c4-8146-6136151de172\" (UID: \"e7d1c5fa-44a4-41c4-8146-6136151de172\") "
Apr 16 17:09:52.752139 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:09:52.752028 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e7d1c5fa-44a4-41c4-8146-6136151de172-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "e7d1c5fa-44a4-41c4-8146-6136151de172" (UID: "e7d1c5fa-44a4-41c4-8146-6136151de172"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 17:09:52.752139 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:09:52.752116 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e7d1c5fa-44a4-41c4-8146-6136151de172-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "e7d1c5fa-44a4-41c4-8146-6136151de172" (UID: "e7d1c5fa-44a4-41c4-8146-6136151de172"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 17:09:52.752242 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:09:52.752163 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e7d1c5fa-44a4-41c4-8146-6136151de172-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "e7d1c5fa-44a4-41c4-8146-6136151de172" (UID: "e7d1c5fa-44a4-41c4-8146-6136151de172"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 17:09:52.752242 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:09:52.752183 2573 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/e7d1c5fa-44a4-41c4-8146-6136151de172-tokenizer-cache\") on node \"ip-10-0-138-58.ec2.internal\" DevicePath \"\""
Apr 16 17:09:52.752538 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:09:52.752508 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e7d1c5fa-44a4-41c4-8146-6136151de172-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "e7d1c5fa-44a4-41c4-8146-6136151de172" (UID: "e7d1c5fa-44a4-41c4-8146-6136151de172"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 17:09:52.753826 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:09:52.753802 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7d1c5fa-44a4-41c4-8146-6136151de172-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "e7d1c5fa-44a4-41c4-8146-6136151de172" (UID: "e7d1c5fa-44a4-41c4-8146-6136151de172"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 17:09:52.753926 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:09:52.753905 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7d1c5fa-44a4-41c4-8146-6136151de172-kube-api-access-cz4pb" (OuterVolumeSpecName: "kube-api-access-cz4pb") pod "e7d1c5fa-44a4-41c4-8146-6136151de172" (UID: "e7d1c5fa-44a4-41c4-8146-6136151de172"). InnerVolumeSpecName "kube-api-access-cz4pb". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 17:09:52.853113 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:09:52.853085 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cz4pb\" (UniqueName: \"kubernetes.io/projected/e7d1c5fa-44a4-41c4-8146-6136151de172-kube-api-access-cz4pb\") on node \"ip-10-0-138-58.ec2.internal\" DevicePath \"\""
Apr 16 17:09:52.853113 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:09:52.853112 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e7d1c5fa-44a4-41c4-8146-6136151de172-kserve-provision-location\") on node \"ip-10-0-138-58.ec2.internal\" DevicePath \"\""
Apr 16 17:09:52.853253 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:09:52.853128 2573 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e7d1c5fa-44a4-41c4-8146-6136151de172-tls-certs\") on node \"ip-10-0-138-58.ec2.internal\" DevicePath \"\""
Apr 16 17:09:52.853253 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:09:52.853142 2573 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/e7d1c5fa-44a4-41c4-8146-6136151de172-tokenizer-tmp\") on node \"ip-10-0-138-58.ec2.internal\" DevicePath \"\""
Apr 16 17:09:52.853253 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:09:52.853157 2573 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/e7d1c5fa-44a4-41c4-8146-6136151de172-tokenizer-uds\") on node \"ip-10-0-138-58.ec2.internal\" DevicePath \"\""
Apr 16 17:09:52.879571 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:09:52.879546 2573 generic.go:358] "Generic (PLEG): container finished" podID="e7d1c5fa-44a4-41c4-8146-6136151de172" containerID="77cfdd1605c1bb76e9894ae1fef17ff07be34804aa65be211f3ee905abbf7c10" exitCode=0
Apr 16 17:09:52.879661 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:09:52.879575 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schew2pwj" event={"ID":"e7d1c5fa-44a4-41c4-8146-6136151de172","Type":"ContainerDied","Data":"77cfdd1605c1bb76e9894ae1fef17ff07be34804aa65be211f3ee905abbf7c10"}
Apr 16 17:09:52.879661 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:09:52.879598 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schew2pwj" event={"ID":"e7d1c5fa-44a4-41c4-8146-6136151de172","Type":"ContainerDied","Data":"27d49d53536d96f98e3773b414ed125e7b413c240b1e8146d9c4ed33491de566"}
Apr 16 17:09:52.879661 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:09:52.879612 2573 scope.go:117] "RemoveContainer" containerID="77cfdd1605c1bb76e9894ae1fef17ff07be34804aa65be211f3ee905abbf7c10"
Apr 16 17:09:52.879661 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:09:52.879624 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schew2pwj"
Apr 16 17:09:52.889986 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:09:52.889968 2573 scope.go:117] "RemoveContainer" containerID="a480c3634f8e66a16fcb9b023fd4689e36133a551e05f213c028c7b10cd286b3"
Apr 16 17:09:52.896981 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:09:52.896962 2573 scope.go:117] "RemoveContainer" containerID="358fe60dc263dcd336b82aa49942b098ea53bbf123cbcf1c8372818516b7ab28"
Apr 16 17:09:52.903043 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:09:52.903019 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schew2pwj"]
Apr 16 17:09:52.904056 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:09:52.904038 2573 scope.go:117] "RemoveContainer" containerID="77cfdd1605c1bb76e9894ae1fef17ff07be34804aa65be211f3ee905abbf7c10"
Apr 16 17:09:52.904339 ip-10-0-138-58 kubenswrapper[2573]: E0416 17:09:52.904317 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77cfdd1605c1bb76e9894ae1fef17ff07be34804aa65be211f3ee905abbf7c10\": container with ID starting with 77cfdd1605c1bb76e9894ae1fef17ff07be34804aa65be211f3ee905abbf7c10 not found: ID does not exist" containerID="77cfdd1605c1bb76e9894ae1fef17ff07be34804aa65be211f3ee905abbf7c10"
Apr 16 17:09:52.904444 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:09:52.904349 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77cfdd1605c1bb76e9894ae1fef17ff07be34804aa65be211f3ee905abbf7c10"} err="failed to get container status \"77cfdd1605c1bb76e9894ae1fef17ff07be34804aa65be211f3ee905abbf7c10\": rpc error: code = NotFound desc = could not find container \"77cfdd1605c1bb76e9894ae1fef17ff07be34804aa65be211f3ee905abbf7c10\": container with ID starting with 77cfdd1605c1bb76e9894ae1fef17ff07be34804aa65be211f3ee905abbf7c10 not found: ID does not exist"
Apr 16 17:09:52.904444 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:09:52.904375 2573 scope.go:117] "RemoveContainer" containerID="a480c3634f8e66a16fcb9b023fd4689e36133a551e05f213c028c7b10cd286b3"
Apr 16 17:09:52.904664 ip-10-0-138-58 kubenswrapper[2573]: E0416 17:09:52.904642 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a480c3634f8e66a16fcb9b023fd4689e36133a551e05f213c028c7b10cd286b3\": container with ID starting with a480c3634f8e66a16fcb9b023fd4689e36133a551e05f213c028c7b10cd286b3 not found: ID does not exist" containerID="a480c3634f8e66a16fcb9b023fd4689e36133a551e05f213c028c7b10cd286b3"
Apr 16 17:09:52.904706 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:09:52.904674 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a480c3634f8e66a16fcb9b023fd4689e36133a551e05f213c028c7b10cd286b3"} err="failed to get container status
\"a480c3634f8e66a16fcb9b023fd4689e36133a551e05f213c028c7b10cd286b3\": rpc error: code = NotFound desc = could not find container \"a480c3634f8e66a16fcb9b023fd4689e36133a551e05f213c028c7b10cd286b3\": container with ID starting with a480c3634f8e66a16fcb9b023fd4689e36133a551e05f213c028c7b10cd286b3 not found: ID does not exist" Apr 16 17:09:52.904706 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:09:52.904695 2573 scope.go:117] "RemoveContainer" containerID="358fe60dc263dcd336b82aa49942b098ea53bbf123cbcf1c8372818516b7ab28" Apr 16 17:09:52.904934 ip-10-0-138-58 kubenswrapper[2573]: E0416 17:09:52.904914 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"358fe60dc263dcd336b82aa49942b098ea53bbf123cbcf1c8372818516b7ab28\": container with ID starting with 358fe60dc263dcd336b82aa49942b098ea53bbf123cbcf1c8372818516b7ab28 not found: ID does not exist" containerID="358fe60dc263dcd336b82aa49942b098ea53bbf123cbcf1c8372818516b7ab28" Apr 16 17:09:52.904978 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:09:52.904942 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"358fe60dc263dcd336b82aa49942b098ea53bbf123cbcf1c8372818516b7ab28"} err="failed to get container status \"358fe60dc263dcd336b82aa49942b098ea53bbf123cbcf1c8372818516b7ab28\": rpc error: code = NotFound desc = could not find container \"358fe60dc263dcd336b82aa49942b098ea53bbf123cbcf1c8372818516b7ab28\": container with ID starting with 358fe60dc263dcd336b82aa49942b098ea53bbf123cbcf1c8372818516b7ab28 not found: ID does not exist" Apr 16 17:09:52.908138 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:09:52.908117 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schew2pwj"] Apr 16 17:09:53.421760 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:09:53.421732 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="e7d1c5fa-44a4-41c4-8146-6136151de172" path="/var/lib/kubelet/pods/e7d1c5fa-44a4-41c4-8146-6136151de172/volumes" Apr 16 17:10:20.013930 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:10:20.013902 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-57c68s66lc"] Apr 16 17:10:20.014292 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:10:20.014175 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e7d1c5fa-44a4-41c4-8146-6136151de172" containerName="tokenizer" Apr 16 17:10:20.014292 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:10:20.014185 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7d1c5fa-44a4-41c4-8146-6136151de172" containerName="tokenizer" Apr 16 17:10:20.014292 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:10:20.014208 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e7d1c5fa-44a4-41c4-8146-6136151de172" containerName="storage-initializer" Apr 16 17:10:20.014292 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:10:20.014214 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7d1c5fa-44a4-41c4-8146-6136151de172" containerName="storage-initializer" Apr 16 17:10:20.014292 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:10:20.014221 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e7d1c5fa-44a4-41c4-8146-6136151de172" containerName="main" Apr 16 17:10:20.014292 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:10:20.014229 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7d1c5fa-44a4-41c4-8146-6136151de172" containerName="main" Apr 16 17:10:20.014292 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:10:20.014283 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="e7d1c5fa-44a4-41c4-8146-6136151de172" containerName="main" Apr 16 17:10:20.014292 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:10:20.014293 2573 memory_manager.go:356] "RemoveStaleState 
removing state" podUID="e7d1c5fa-44a4-41c4-8146-6136151de172" containerName="tokenizer" Apr 16 17:10:20.017087 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:10:20.017071 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-57c68s66lc" Apr 16 17:10:20.020097 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:10:20.020075 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-epp-sa-dockercfg-7962q\"" Apr 16 17:10:20.020216 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:10:20.020142 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-g4psd\"" Apr 16 17:10:20.021294 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:10:20.021275 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-kserve-self-signed-certs\"" Apr 16 17:10:20.027880 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:10:20.027857 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-57c68s66lc"] Apr 16 17:10:20.154874 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:10:20.154848 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xq74n\" (UniqueName: \"kubernetes.io/projected/0a3939f0-ce74-4722-9272-78d567def0c5-kube-api-access-xq74n\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-57c68s66lc\" (UID: \"0a3939f0-ce74-4722-9272-78d567def0c5\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-57c68s66lc" Apr 16 17:10:20.155038 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:10:20.154885 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/0a3939f0-ce74-4722-9272-78d567def0c5-tokenizer-tmp\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-57c68s66lc\" (UID: \"0a3939f0-ce74-4722-9272-78d567def0c5\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-57c68s66lc" Apr 16 17:10:20.155038 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:10:20.154939 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0a3939f0-ce74-4722-9272-78d567def0c5-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-57c68s66lc\" (UID: \"0a3939f0-ce74-4722-9272-78d567def0c5\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-57c68s66lc" Apr 16 17:10:20.155038 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:10:20.154984 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0a3939f0-ce74-4722-9272-78d567def0c5-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-57c68s66lc\" (UID: \"0a3939f0-ce74-4722-9272-78d567def0c5\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-57c68s66lc" Apr 16 17:10:20.155198 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:10:20.155065 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/0a3939f0-ce74-4722-9272-78d567def0c5-tokenizer-cache\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-57c68s66lc\" (UID: \"0a3939f0-ce74-4722-9272-78d567def0c5\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-57c68s66lc" Apr 16 17:10:20.155198 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:10:20.155110 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/0a3939f0-ce74-4722-9272-78d567def0c5-tokenizer-uds\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-57c68s66lc\" (UID: \"0a3939f0-ce74-4722-9272-78d567def0c5\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-57c68s66lc" Apr 16 17:10:20.255587 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:10:20.255558 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/0a3939f0-ce74-4722-9272-78d567def0c5-tokenizer-cache\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-57c68s66lc\" (UID: \"0a3939f0-ce74-4722-9272-78d567def0c5\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-57c68s66lc" Apr 16 17:10:20.255740 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:10:20.255597 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/0a3939f0-ce74-4722-9272-78d567def0c5-tokenizer-uds\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-57c68s66lc\" (UID: \"0a3939f0-ce74-4722-9272-78d567def0c5\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-57c68s66lc" Apr 16 17:10:20.255740 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:10:20.255632 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xq74n\" (UniqueName: \"kubernetes.io/projected/0a3939f0-ce74-4722-9272-78d567def0c5-kube-api-access-xq74n\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-57c68s66lc\" (UID: \"0a3939f0-ce74-4722-9272-78d567def0c5\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-57c68s66lc" Apr 16 17:10:20.255740 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:10:20.255665 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/0a3939f0-ce74-4722-9272-78d567def0c5-tokenizer-tmp\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-57c68s66lc\" (UID: \"0a3939f0-ce74-4722-9272-78d567def0c5\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-57c68s66lc" Apr 16 17:10:20.255740 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:10:20.255716 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0a3939f0-ce74-4722-9272-78d567def0c5-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-57c68s66lc\" (UID: \"0a3939f0-ce74-4722-9272-78d567def0c5\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-57c68s66lc" Apr 16 17:10:20.255952 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:10:20.255754 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0a3939f0-ce74-4722-9272-78d567def0c5-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-57c68s66lc\" (UID: \"0a3939f0-ce74-4722-9272-78d567def0c5\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-57c68s66lc" Apr 16 17:10:20.256044 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:10:20.256020 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/0a3939f0-ce74-4722-9272-78d567def0c5-tokenizer-cache\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-57c68s66lc\" (UID: \"0a3939f0-ce74-4722-9272-78d567def0c5\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-57c68s66lc" Apr 16 17:10:20.256104 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:10:20.256060 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: 
\"kubernetes.io/empty-dir/0a3939f0-ce74-4722-9272-78d567def0c5-tokenizer-uds\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-57c68s66lc\" (UID: \"0a3939f0-ce74-4722-9272-78d567def0c5\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-57c68s66lc" Apr 16 17:10:20.256104 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:10:20.256091 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/0a3939f0-ce74-4722-9272-78d567def0c5-tokenizer-tmp\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-57c68s66lc\" (UID: \"0a3939f0-ce74-4722-9272-78d567def0c5\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-57c68s66lc" Apr 16 17:10:20.256172 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:10:20.256110 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0a3939f0-ce74-4722-9272-78d567def0c5-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-57c68s66lc\" (UID: \"0a3939f0-ce74-4722-9272-78d567def0c5\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-57c68s66lc" Apr 16 17:10:20.258078 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:10:20.258059 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0a3939f0-ce74-4722-9272-78d567def0c5-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-57c68s66lc\" (UID: \"0a3939f0-ce74-4722-9272-78d567def0c5\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-57c68s66lc" Apr 16 17:10:20.264271 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:10:20.264200 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xq74n\" (UniqueName: 
\"kubernetes.io/projected/0a3939f0-ce74-4722-9272-78d567def0c5-kube-api-access-xq74n\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-57c68s66lc\" (UID: \"0a3939f0-ce74-4722-9272-78d567def0c5\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-57c68s66lc" Apr 16 17:10:20.326142 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:10:20.326118 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-57c68s66lc" Apr 16 17:10:20.448603 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:10:20.448577 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-57c68s66lc"] Apr 16 17:10:20.451175 ip-10-0-138-58 kubenswrapper[2573]: W0416 17:10:20.451147 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a3939f0_ce74_4722_9272_78d567def0c5.slice/crio-df0822d0798f06b30235688974490d4c8608771c26305c82f98b54c475c2ab29 WatchSource:0}: Error finding container df0822d0798f06b30235688974490d4c8608771c26305c82f98b54c475c2ab29: Status 404 returned error can't find the container with id df0822d0798f06b30235688974490d4c8608771c26305c82f98b54c475c2ab29 Apr 16 17:10:20.453028 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:10:20.453011 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 17:10:20.970310 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:10:20.970277 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-57c68s66lc" event={"ID":"0a3939f0-ce74-4722-9272-78d567def0c5","Type":"ContainerStarted","Data":"efff05ecfc9b0e89370abd6c2bea2bffc91287070b2ef9eb92fc09eb8a6c85f9"} Apr 16 17:10:20.970310 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:10:20.970313 2573 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-57c68s66lc" event={"ID":"0a3939f0-ce74-4722-9272-78d567def0c5","Type":"ContainerStarted","Data":"df0822d0798f06b30235688974490d4c8608771c26305c82f98b54c475c2ab29"} Apr 16 17:10:21.974431 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:10:21.974400 2573 generic.go:358] "Generic (PLEG): container finished" podID="0a3939f0-ce74-4722-9272-78d567def0c5" containerID="efff05ecfc9b0e89370abd6c2bea2bffc91287070b2ef9eb92fc09eb8a6c85f9" exitCode=0 Apr 16 17:10:21.974799 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:10:21.974480 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-57c68s66lc" event={"ID":"0a3939f0-ce74-4722-9272-78d567def0c5","Type":"ContainerDied","Data":"efff05ecfc9b0e89370abd6c2bea2bffc91287070b2ef9eb92fc09eb8a6c85f9"} Apr 16 17:10:22.985946 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:10:22.985901 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-57c68s66lc" event={"ID":"0a3939f0-ce74-4722-9272-78d567def0c5","Type":"ContainerStarted","Data":"1bc9deebfecc4f308d8b567ad89ee7984f2354350fbd8f1e835d6fba234f45b9"} Apr 16 17:10:22.985946 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:10:22.985944 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-57c68s66lc" event={"ID":"0a3939f0-ce74-4722-9272-78d567def0c5","Type":"ContainerStarted","Data":"24c2e567889718daa56104f0b27cc84c87206f053ce19325565b055eebb4c953"} Apr 16 17:10:22.986367 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:10:22.986077 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-57c68s66lc" Apr 16 17:10:23.009308 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:10:23.009273 2573 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-57c68s66lc" podStartSLOduration=4.009261316 podStartE2EDuration="4.009261316s" podCreationTimestamp="2026-04-16 17:10:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 17:10:23.007583183 +0000 UTC m=+1348.148301925" watchObservedRunningTime="2026-04-16 17:10:23.009261316 +0000 UTC m=+1348.149980057" Apr 16 17:10:30.327045 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:10:30.326993 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-57c68s66lc" Apr 16 17:10:30.327045 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:10:30.327057 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-57c68s66lc" Apr 16 17:10:30.330239 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:10:30.330213 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-57c68s66lc" Apr 16 17:10:31.011704 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:10:31.011670 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-57c68s66lc" Apr 16 17:10:35.307140 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:10:35.307108 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 16 17:10:35.310898 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:10:35.310881 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 17:10:35.313487 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:10:35.313466 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs\"" Apr 16 17:10:35.313487 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:10:35.313480 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-f312f5-cb7fb8cf-dockercfg-rv5zk\"" Apr 16 17:10:35.319030 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:10:35.319009 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 16 17:10:35.388595 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:10:35.388569 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebsshf"] Apr 16 17:10:35.392647 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:10:35.392626 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebsshf" Apr 16 17:10:35.395234 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:10:35.395215 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-f312f5ec-epp-sa-dockercfg-cf5g8\"" Apr 16 17:10:35.402917 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:10:35.402893 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebsshf"] Apr 16 17:10:35.468933 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:10:35.468906 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9974a58f-765a-41bc-bebc-aafb5c7e387a-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"9974a58f-765a-41bc-bebc-aafb5c7e387a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 17:10:35.469058 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:10:35.468950 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9974a58f-765a-41bc-bebc-aafb5c7e387a-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"9974a58f-765a-41bc-bebc-aafb5c7e387a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 17:10:35.469058 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:10:35.468980 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgp9p\" (UniqueName: \"kubernetes.io/projected/9974a58f-765a-41bc-bebc-aafb5c7e387a-kube-api-access-mgp9p\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"9974a58f-765a-41bc-bebc-aafb5c7e387a\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 17:10:35.469183 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:10:35.469059 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9974a58f-765a-41bc-bebc-aafb5c7e387a-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"9974a58f-765a-41bc-bebc-aafb5c7e387a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 17:10:35.469183 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:10:35.469092 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9974a58f-765a-41bc-bebc-aafb5c7e387a-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"9974a58f-765a-41bc-bebc-aafb5c7e387a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 17:10:35.469282 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:10:35.469199 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9974a58f-765a-41bc-bebc-aafb5c7e387a-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"9974a58f-765a-41bc-bebc-aafb5c7e387a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 17:10:35.570568 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:10:35.570485 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9974a58f-765a-41bc-bebc-aafb5c7e387a-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"9974a58f-765a-41bc-bebc-aafb5c7e387a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 17:10:35.570568 ip-10-0-138-58 
kubenswrapper[2573]: I0416 17:10:35.570523 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/5535f7d5-c91c-45d4-bd9d-4a6e3fe3a8f2-tokenizer-tmp\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebsshf\" (UID: \"5535f7d5-c91c-45d4-bd9d-4a6e3fe3a8f2\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebsshf"
Apr 16 17:10:35.570568 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:10:35.570546 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9974a58f-765a-41bc-bebc-aafb5c7e387a-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"9974a58f-765a-41bc-bebc-aafb5c7e387a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 16 17:10:35.570835 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:10:35.570575 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rv244\" (UniqueName: \"kubernetes.io/projected/5535f7d5-c91c-45d4-bd9d-4a6e3fe3a8f2-kube-api-access-rv244\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebsshf\" (UID: \"5535f7d5-c91c-45d4-bd9d-4a6e3fe3a8f2\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebsshf"
Apr 16 17:10:35.570835 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:10:35.570604 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5535f7d5-c91c-45d4-bd9d-4a6e3fe3a8f2-tls-certs\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebsshf\" (UID: \"5535f7d5-c91c-45d4-bd9d-4a6e3fe3a8f2\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebsshf"
Apr 16 17:10:35.570835 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:10:35.570726 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5535f7d5-c91c-45d4-bd9d-4a6e3fe3a8f2-kserve-provision-location\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebsshf\" (UID: \"5535f7d5-c91c-45d4-bd9d-4a6e3fe3a8f2\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebsshf"
Apr 16 17:10:35.570835 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:10:35.570781 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9974a58f-765a-41bc-bebc-aafb5c7e387a-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"9974a58f-765a-41bc-bebc-aafb5c7e387a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 16 17:10:35.570835 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:10:35.570816 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/5535f7d5-c91c-45d4-bd9d-4a6e3fe3a8f2-tokenizer-cache\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebsshf\" (UID: \"5535f7d5-c91c-45d4-bd9d-4a6e3fe3a8f2\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebsshf"
Apr 16 17:10:35.571104 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:10:35.570856 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9974a58f-765a-41bc-bebc-aafb5c7e387a-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"9974a58f-765a-41bc-bebc-aafb5c7e387a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 16 17:10:35.571104 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:10:35.570922 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9974a58f-765a-41bc-bebc-aafb5c7e387a-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"9974a58f-765a-41bc-bebc-aafb5c7e387a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 16 17:10:35.571104 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:10:35.570949 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mgp9p\" (UniqueName: \"kubernetes.io/projected/9974a58f-765a-41bc-bebc-aafb5c7e387a-kube-api-access-mgp9p\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"9974a58f-765a-41bc-bebc-aafb5c7e387a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 16 17:10:35.571104 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:10:35.570977 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/5535f7d5-c91c-45d4-bd9d-4a6e3fe3a8f2-tokenizer-uds\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebsshf\" (UID: \"5535f7d5-c91c-45d4-bd9d-4a6e3fe3a8f2\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebsshf"
Apr 16 17:10:35.571104 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:10:35.571059 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9974a58f-765a-41bc-bebc-aafb5c7e387a-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"9974a58f-765a-41bc-bebc-aafb5c7e387a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 16 17:10:35.571453 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:10:35.571104 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9974a58f-765a-41bc-bebc-aafb5c7e387a-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"9974a58f-765a-41bc-bebc-aafb5c7e387a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 16 17:10:35.571453 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:10:35.571312 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9974a58f-765a-41bc-bebc-aafb5c7e387a-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"9974a58f-765a-41bc-bebc-aafb5c7e387a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 16 17:10:35.572922 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:10:35.572901 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9974a58f-765a-41bc-bebc-aafb5c7e387a-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"9974a58f-765a-41bc-bebc-aafb5c7e387a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 16 17:10:35.573160 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:10:35.573144 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9974a58f-765a-41bc-bebc-aafb5c7e387a-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"9974a58f-765a-41bc-bebc-aafb5c7e387a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 16 17:10:35.578655 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:10:35.578637 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgp9p\" (UniqueName: \"kubernetes.io/projected/9974a58f-765a-41bc-bebc-aafb5c7e387a-kube-api-access-mgp9p\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID:
\"9974a58f-765a-41bc-bebc-aafb5c7e387a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 16 17:10:35.623820 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:10:35.623800 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 16 17:10:35.672011 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:10:35.671524 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/5535f7d5-c91c-45d4-bd9d-4a6e3fe3a8f2-tokenizer-cache\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebsshf\" (UID: \"5535f7d5-c91c-45d4-bd9d-4a6e3fe3a8f2\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebsshf"
Apr 16 17:10:35.672236 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:10:35.671976 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/5535f7d5-c91c-45d4-bd9d-4a6e3fe3a8f2-tokenizer-cache\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebsshf\" (UID: \"5535f7d5-c91c-45d4-bd9d-4a6e3fe3a8f2\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebsshf"
Apr 16 17:10:35.672236 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:10:35.672200 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/5535f7d5-c91c-45d4-bd9d-4a6e3fe3a8f2-tokenizer-uds\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebsshf\" (UID: \"5535f7d5-c91c-45d4-bd9d-4a6e3fe3a8f2\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebsshf"
Apr 16 17:10:35.672674 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:10:35.672617 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/5535f7d5-c91c-45d4-bd9d-4a6e3fe3a8f2-tokenizer-uds\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebsshf\" (UID: \"5535f7d5-c91c-45d4-bd9d-4a6e3fe3a8f2\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebsshf"
Apr 16 17:10:35.673070 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:10:35.673017 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/5535f7d5-c91c-45d4-bd9d-4a6e3fe3a8f2-tokenizer-tmp\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebsshf\" (UID: \"5535f7d5-c91c-45d4-bd9d-4a6e3fe3a8f2\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebsshf"
Apr 16 17:10:35.673137 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:10:35.673093 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/5535f7d5-c91c-45d4-bd9d-4a6e3fe3a8f2-tokenizer-tmp\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebsshf\" (UID: \"5535f7d5-c91c-45d4-bd9d-4a6e3fe3a8f2\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebsshf"
Apr 16 17:10:35.673202 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:10:35.673167 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rv244\" (UniqueName: \"kubernetes.io/projected/5535f7d5-c91c-45d4-bd9d-4a6e3fe3a8f2-kube-api-access-rv244\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebsshf\" (UID: \"5535f7d5-c91c-45d4-bd9d-4a6e3fe3a8f2\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebsshf"
Apr 16 17:10:35.673636 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:10:35.673618 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5535f7d5-c91c-45d4-bd9d-4a6e3fe3a8f2-tls-certs\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebsshf\" (UID: \"5535f7d5-c91c-45d4-bd9d-4a6e3fe3a8f2\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebsshf"
Apr 16 17:10:35.673728 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:10:35.673678 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5535f7d5-c91c-45d4-bd9d-4a6e3fe3a8f2-kserve-provision-location\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebsshf\" (UID: \"5535f7d5-c91c-45d4-bd9d-4a6e3fe3a8f2\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebsshf"
Apr 16 17:10:35.674019 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:10:35.673996 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5535f7d5-c91c-45d4-bd9d-4a6e3fe3a8f2-kserve-provision-location\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebsshf\" (UID: \"5535f7d5-c91c-45d4-bd9d-4a6e3fe3a8f2\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebsshf"
Apr 16 17:10:35.676643 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:10:35.676618 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5535f7d5-c91c-45d4-bd9d-4a6e3fe3a8f2-tls-certs\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebsshf\" (UID: \"5535f7d5-c91c-45d4-bd9d-4a6e3fe3a8f2\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebsshf"
Apr 16 17:10:35.682029 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:10:35.682006 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rv244\" (UniqueName: \"kubernetes.io/projected/5535f7d5-c91c-45d4-bd9d-4a6e3fe3a8f2-kube-api-access-rv244\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebsshf\" (UID: \"5535f7d5-c91c-45d4-bd9d-4a6e3fe3a8f2\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebsshf"
Apr 16 17:10:35.705126 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:10:35.705102 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebsshf"
Apr 16 17:10:35.753403 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:10:35.753351 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"]
Apr 16 17:10:35.757170 ip-10-0-138-58 kubenswrapper[2573]: W0416 17:10:35.757135 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9974a58f_765a_41bc_bebc_aafb5c7e387a.slice/crio-19753a648364d2538758b361f601cb0bad1a4d29f8f0c2582ef1e52ef8204885 WatchSource:0}: Error finding container 19753a648364d2538758b361f601cb0bad1a4d29f8f0c2582ef1e52ef8204885: Status 404 returned error can't find the container with id 19753a648364d2538758b361f601cb0bad1a4d29f8f0c2582ef1e52ef8204885
Apr 16 17:10:35.834148 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:10:35.834125 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebsshf"]
Apr 16 17:10:35.835943 ip-10-0-138-58 kubenswrapper[2573]: W0416 17:10:35.835916 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5535f7d5_c91c_45d4_bd9d_4a6e3fe3a8f2.slice/crio-8fcfb31f10c45336aa7a659c4d29bb88b9b6a1c4d3ba79208d865559d74a1b66 WatchSource:0}: Error finding container 8fcfb31f10c45336aa7a659c4d29bb88b9b6a1c4d3ba79208d865559d74a1b66: Status 404 returned
error can't find the container with id 8fcfb31f10c45336aa7a659c4d29bb88b9b6a1c4d3ba79208d865559d74a1b66
Apr 16 17:10:36.027916 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:10:36.027866 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebsshf" event={"ID":"5535f7d5-c91c-45d4-bd9d-4a6e3fe3a8f2","Type":"ContainerStarted","Data":"8db1daea2e7811f4ddc64ce066a7ff93cc698d45b3a9382a890321c611999f3c"}
Apr 16 17:10:36.028108 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:10:36.027925 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebsshf" event={"ID":"5535f7d5-c91c-45d4-bd9d-4a6e3fe3a8f2","Type":"ContainerStarted","Data":"8fcfb31f10c45336aa7a659c4d29bb88b9b6a1c4d3ba79208d865559d74a1b66"}
Apr 16 17:10:36.029582 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:10:36.029525 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"9974a58f-765a-41bc-bebc-aafb5c7e387a","Type":"ContainerStarted","Data":"47e7875ece99474fdabee1eb6d5d619c5a7b2dd8b09b28e28f37ca5e705d821d"}
Apr 16 17:10:36.029582 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:10:36.029559 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"9974a58f-765a-41bc-bebc-aafb5c7e387a","Type":"ContainerStarted","Data":"19753a648364d2538758b361f601cb0bad1a4d29f8f0c2582ef1e52ef8204885"}
Apr 16 17:10:37.034678 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:10:37.034643 2573 generic.go:358] "Generic (PLEG): container finished" podID="5535f7d5-c91c-45d4-bd9d-4a6e3fe3a8f2" containerID="8db1daea2e7811f4ddc64ce066a7ff93cc698d45b3a9382a890321c611999f3c" exitCode=0
Apr 16 17:10:37.035153 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:10:37.034731 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebsshf" event={"ID":"5535f7d5-c91c-45d4-bd9d-4a6e3fe3a8f2","Type":"ContainerDied","Data":"8db1daea2e7811f4ddc64ce066a7ff93cc698d45b3a9382a890321c611999f3c"}
Apr 16 17:10:38.040788 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:10:38.040748 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebsshf" event={"ID":"5535f7d5-c91c-45d4-bd9d-4a6e3fe3a8f2","Type":"ContainerStarted","Data":"42ea3fec58036c798913867accf6a64ae10725c5062a68c200ecd9df98570c45"}
Apr 16 17:10:38.040788 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:10:38.040790 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebsshf" event={"ID":"5535f7d5-c91c-45d4-bd9d-4a6e3fe3a8f2","Type":"ContainerStarted","Data":"ce145a9bd1df3860ebc67c89227cc75a4059a9a9a044dd2a7994df5518ad234c"}
Apr 16 17:10:38.041336 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:10:38.040914 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebsshf"
Apr 16 17:10:38.063869 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:10:38.063808 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebsshf" podStartSLOduration=3.06379271 podStartE2EDuration="3.06379271s" podCreationTimestamp="2026-04-16 17:10:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 17:10:38.061366246 +0000 UTC m=+1363.202084989" watchObservedRunningTime="2026-04-16 17:10:38.06379271 +0000 UTC m=+1363.204511452"
Apr 16 17:10:40.048974 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:10:40.048946 2573 generic.go:358] "Generic (PLEG): container finished" podID="9974a58f-765a-41bc-bebc-aafb5c7e387a" containerID="47e7875ece99474fdabee1eb6d5d619c5a7b2dd8b09b28e28f37ca5e705d821d" exitCode=0
Apr 16 17:10:40.049337 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:10:40.048989 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"9974a58f-765a-41bc-bebc-aafb5c7e387a","Type":"ContainerDied","Data":"47e7875ece99474fdabee1eb6d5d619c5a7b2dd8b09b28e28f37ca5e705d821d"}
Apr 16 17:10:45.705271 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:10:45.705234 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebsshf"
Apr 16 17:10:45.705752 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:10:45.705281 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebsshf"
Apr 16 17:10:45.708497 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:10:45.708473 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebsshf"
Apr 16 17:10:46.074802 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:10:46.074779 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebsshf"
Apr 16 17:10:52.014490 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:10:52.014466 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-57c68s66lc"
Apr 16 17:11:07.078116 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:11:07.078088 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebsshf"
Apr 16 17:11:24.208811 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:11:24.208779 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"9974a58f-765a-41bc-bebc-aafb5c7e387a","Type":"ContainerStarted","Data":"8ea973c7cc06bd2b10794cf342e124fb115926cde55586c7ec2393c39c48c9f7"}
Apr 16 17:11:24.227602 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:11:24.227547 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" podStartSLOduration=5.295811341 podStartE2EDuration="49.227528471s" podCreationTimestamp="2026-04-16 17:10:35 +0000 UTC" firstStartedPulling="2026-04-16 17:10:40.050171856 +0000 UTC m=+1365.190890580" lastFinishedPulling="2026-04-16 17:11:23.981888982 +0000 UTC m=+1409.122607710" observedRunningTime="2026-04-16 17:11:24.225367851 +0000 UTC m=+1409.366086592" watchObservedRunningTime="2026-04-16 17:11:24.227528471 +0000 UTC m=+1409.368247220"
Apr 16 17:12:55.428995 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:12:55.428967 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-jgsfb_6c99cce0-b27a-481f-8825-9d205581b7d0/console-operator/1.log"
Apr 16 17:12:55.435030 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:12:55.435007 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-jgsfb_6c99cce0-b27a-481f-8825-9d205581b7d0/console-operator/1.log"
Apr 16 17:13:24.887212 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:24.887175 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-57c68s66lc"]
Apr 16 17:13:24.887784 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:24.887493 2573 kuberuntime_container.go:864] "Killing container with a grace period"
pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-57c68s66lc" podUID="0a3939f0-ce74-4722-9272-78d567def0c5" containerName="main" containerID="cri-o://24c2e567889718daa56104f0b27cc84c87206f053ce19325565b055eebb4c953" gracePeriod=30
Apr 16 17:13:24.887784 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:24.887533 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-57c68s66lc" podUID="0a3939f0-ce74-4722-9272-78d567def0c5" containerName="tokenizer" containerID="cri-o://1bc9deebfecc4f308d8b567ad89ee7984f2354350fbd8f1e835d6fba234f45b9" gracePeriod=30
Apr 16 17:13:25.609454 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:25.609421 2573 generic.go:358] "Generic (PLEG): container finished" podID="0a3939f0-ce74-4722-9272-78d567def0c5" containerID="24c2e567889718daa56104f0b27cc84c87206f053ce19325565b055eebb4c953" exitCode=0
Apr 16 17:13:25.609632 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:25.609485 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-57c68s66lc" event={"ID":"0a3939f0-ce74-4722-9272-78d567def0c5","Type":"ContainerDied","Data":"24c2e567889718daa56104f0b27cc84c87206f053ce19325565b055eebb4c953"}
Apr 16 17:13:26.026056 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:26.026027 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-57c68s66lc"
Apr 16 17:13:26.155500 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:26.155439 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xq74n\" (UniqueName: \"kubernetes.io/projected/0a3939f0-ce74-4722-9272-78d567def0c5-kube-api-access-xq74n\") pod \"0a3939f0-ce74-4722-9272-78d567def0c5\" (UID: \"0a3939f0-ce74-4722-9272-78d567def0c5\") "
Apr 16 17:13:26.155647 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:26.155503 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/0a3939f0-ce74-4722-9272-78d567def0c5-tokenizer-uds\") pod \"0a3939f0-ce74-4722-9272-78d567def0c5\" (UID: \"0a3939f0-ce74-4722-9272-78d567def0c5\") "
Apr 16 17:13:26.155647 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:26.155542 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/0a3939f0-ce74-4722-9272-78d567def0c5-tokenizer-cache\") pod \"0a3939f0-ce74-4722-9272-78d567def0c5\" (UID: \"0a3939f0-ce74-4722-9272-78d567def0c5\") "
Apr 16 17:13:26.155647 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:26.155572 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0a3939f0-ce74-4722-9272-78d567def0c5-tls-certs\") pod \"0a3939f0-ce74-4722-9272-78d567def0c5\" (UID: \"0a3939f0-ce74-4722-9272-78d567def0c5\") "
Apr 16 17:13:26.155647 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:26.155598 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/0a3939f0-ce74-4722-9272-78d567def0c5-tokenizer-tmp\") pod \"0a3939f0-ce74-4722-9272-78d567def0c5\" (UID: \"0a3939f0-ce74-4722-9272-78d567def0c5\") "
Apr 16 17:13:26.155647 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:26.155620 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0a3939f0-ce74-4722-9272-78d567def0c5-kserve-provision-location\") pod \"0a3939f0-ce74-4722-9272-78d567def0c5\" (UID: \"0a3939f0-ce74-4722-9272-78d567def0c5\") "
Apr 16 17:13:26.155931 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:26.155867 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a3939f0-ce74-4722-9272-78d567def0c5-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "0a3939f0-ce74-4722-9272-78d567def0c5" (UID: "0a3939f0-ce74-4722-9272-78d567def0c5"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 17:13:26.155931 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:26.155891 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a3939f0-ce74-4722-9272-78d567def0c5-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "0a3939f0-ce74-4722-9272-78d567def0c5" (UID: "0a3939f0-ce74-4722-9272-78d567def0c5"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 17:13:26.156030 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:26.155981 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a3939f0-ce74-4722-9272-78d567def0c5-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "0a3939f0-ce74-4722-9272-78d567def0c5" (UID: "0a3939f0-ce74-4722-9272-78d567def0c5"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 17:13:26.156438 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:26.156418 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a3939f0-ce74-4722-9272-78d567def0c5-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "0a3939f0-ce74-4722-9272-78d567def0c5" (UID: "0a3939f0-ce74-4722-9272-78d567def0c5"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 17:13:26.157706 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:26.157684 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a3939f0-ce74-4722-9272-78d567def0c5-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "0a3939f0-ce74-4722-9272-78d567def0c5" (UID: "0a3939f0-ce74-4722-9272-78d567def0c5"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 17:13:26.157799 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:26.157743 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a3939f0-ce74-4722-9272-78d567def0c5-kube-api-access-xq74n" (OuterVolumeSpecName: "kube-api-access-xq74n") pod "0a3939f0-ce74-4722-9272-78d567def0c5" (UID: "0a3939f0-ce74-4722-9272-78d567def0c5"). InnerVolumeSpecName "kube-api-access-xq74n".
PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 17:13:26.256198 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:26.256173 2573 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/0a3939f0-ce74-4722-9272-78d567def0c5-tokenizer-uds\") on node \"ip-10-0-138-58.ec2.internal\" DevicePath \"\""
Apr 16 17:13:26.256198 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:26.256196 2573 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/0a3939f0-ce74-4722-9272-78d567def0c5-tokenizer-cache\") on node \"ip-10-0-138-58.ec2.internal\" DevicePath \"\""
Apr 16 17:13:26.256336 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:26.256205 2573 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0a3939f0-ce74-4722-9272-78d567def0c5-tls-certs\") on node \"ip-10-0-138-58.ec2.internal\" DevicePath \"\""
Apr 16 17:13:26.256336 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:26.256217 2573 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/0a3939f0-ce74-4722-9272-78d567def0c5-tokenizer-tmp\") on node \"ip-10-0-138-58.ec2.internal\" DevicePath \"\""
Apr 16 17:13:26.256336 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:26.256226 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0a3939f0-ce74-4722-9272-78d567def0c5-kserve-provision-location\") on node \"ip-10-0-138-58.ec2.internal\" DevicePath \"\""
Apr 16 17:13:26.256336 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:26.256235 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xq74n\" (UniqueName: \"kubernetes.io/projected/0a3939f0-ce74-4722-9272-78d567def0c5-kube-api-access-xq74n\") on node \"ip-10-0-138-58.ec2.internal\" DevicePath \"\""
Apr 16 17:13:26.614980 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:26.614945 2573 generic.go:358] "Generic (PLEG): container finished" podID="0a3939f0-ce74-4722-9272-78d567def0c5" containerID="1bc9deebfecc4f308d8b567ad89ee7984f2354350fbd8f1e835d6fba234f45b9" exitCode=0
Apr 16 17:13:26.614980 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:26.614983 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-57c68s66lc" event={"ID":"0a3939f0-ce74-4722-9272-78d567def0c5","Type":"ContainerDied","Data":"1bc9deebfecc4f308d8b567ad89ee7984f2354350fbd8f1e835d6fba234f45b9"}
Apr 16 17:13:26.615220 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:26.615010 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-57c68s66lc" event={"ID":"0a3939f0-ce74-4722-9272-78d567def0c5","Type":"ContainerDied","Data":"df0822d0798f06b30235688974490d4c8608771c26305c82f98b54c475c2ab29"}
Apr 16 17:13:26.615220 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:26.615018 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-57c68s66lc"
Apr 16 17:13:26.615220 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:26.615028 2573 scope.go:117] "RemoveContainer" containerID="1bc9deebfecc4f308d8b567ad89ee7984f2354350fbd8f1e835d6fba234f45b9"
Apr 16 17:13:26.624854 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:26.624614 2573 scope.go:117] "RemoveContainer" containerID="24c2e567889718daa56104f0b27cc84c87206f053ce19325565b055eebb4c953"
Apr 16 17:13:26.632006 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:26.631993 2573 scope.go:117] "RemoveContainer" containerID="efff05ecfc9b0e89370abd6c2bea2bffc91287070b2ef9eb92fc09eb8a6c85f9"
Apr 16 17:13:26.638831 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:26.638814 2573 scope.go:117] "RemoveContainer" containerID="1bc9deebfecc4f308d8b567ad89ee7984f2354350fbd8f1e835d6fba234f45b9"
Apr 16 17:13:26.639093 ip-10-0-138-58 kubenswrapper[2573]: E0416 17:13:26.639072 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1bc9deebfecc4f308d8b567ad89ee7984f2354350fbd8f1e835d6fba234f45b9\": container with ID starting with 1bc9deebfecc4f308d8b567ad89ee7984f2354350fbd8f1e835d6fba234f45b9 not found: ID does not exist" containerID="1bc9deebfecc4f308d8b567ad89ee7984f2354350fbd8f1e835d6fba234f45b9"
Apr 16 17:13:26.639141 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:26.639105 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1bc9deebfecc4f308d8b567ad89ee7984f2354350fbd8f1e835d6fba234f45b9"} err="failed to get container status \"1bc9deebfecc4f308d8b567ad89ee7984f2354350fbd8f1e835d6fba234f45b9\": rpc error: code = NotFound desc = could not find container \"1bc9deebfecc4f308d8b567ad89ee7984f2354350fbd8f1e835d6fba234f45b9\": container with ID starting with 1bc9deebfecc4f308d8b567ad89ee7984f2354350fbd8f1e835d6fba234f45b9 not found: ID does not exist"
Apr 16 17:13:26.639141 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:26.639123 2573 scope.go:117] "RemoveContainer" containerID="24c2e567889718daa56104f0b27cc84c87206f053ce19325565b055eebb4c953"
Apr 16 17:13:26.639340 ip-10-0-138-58 kubenswrapper[2573]: E0416 17:13:26.639328 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24c2e567889718daa56104f0b27cc84c87206f053ce19325565b055eebb4c953\": container with ID starting with 24c2e567889718daa56104f0b27cc84c87206f053ce19325565b055eebb4c953 not found: ID does not exist" containerID="24c2e567889718daa56104f0b27cc84c87206f053ce19325565b055eebb4c953"
Apr 16 17:13:26.639391 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:26.639344 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24c2e567889718daa56104f0b27cc84c87206f053ce19325565b055eebb4c953"} err="failed to get container status \"24c2e567889718daa56104f0b27cc84c87206f053ce19325565b055eebb4c953\": rpc error: code = NotFound desc = could not find container \"24c2e567889718daa56104f0b27cc84c87206f053ce19325565b055eebb4c953\": container with ID starting with 24c2e567889718daa56104f0b27cc84c87206f053ce19325565b055eebb4c953 not found: ID does not exist"
Apr 16 17:13:26.639391 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:26.639359 2573 scope.go:117] "RemoveContainer" containerID="efff05ecfc9b0e89370abd6c2bea2bffc91287070b2ef9eb92fc09eb8a6c85f9"
Apr 16 17:13:26.639608 ip-10-0-138-58 kubenswrapper[2573]: E0416 17:13:26.639594 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"efff05ecfc9b0e89370abd6c2bea2bffc91287070b2ef9eb92fc09eb8a6c85f9\": container with ID starting with efff05ecfc9b0e89370abd6c2bea2bffc91287070b2ef9eb92fc09eb8a6c85f9 not found: ID does not exist" containerID="efff05ecfc9b0e89370abd6c2bea2bffc91287070b2ef9eb92fc09eb8a6c85f9"
Apr 16 17:13:26.639645 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:26.639613 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efff05ecfc9b0e89370abd6c2bea2bffc91287070b2ef9eb92fc09eb8a6c85f9"} err="failed to get container status \"efff05ecfc9b0e89370abd6c2bea2bffc91287070b2ef9eb92fc09eb8a6c85f9\": rpc error: code = NotFound desc = could not find container \"efff05ecfc9b0e89370abd6c2bea2bffc91287070b2ef9eb92fc09eb8a6c85f9\": container with ID starting with efff05ecfc9b0e89370abd6c2bea2bffc91287070b2ef9eb92fc09eb8a6c85f9 not found: ID does not exist"
Apr 16 17:13:26.646165 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:26.646145 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-57c68s66lc"]
Apr 16 17:13:26.649256 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:26.649236 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-57c68s66lc"]
Apr 16 17:13:27.422442 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:27.422402 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a3939f0-ce74-4722-9272-78d567def0c5" path="/var/lib/kubelet/pods/0a3939f0-ce74-4722-9272-78d567def0c5/volumes"
Apr 16 17:13:32.907211 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:32.907171 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7f8ccb594c-zf7sz"]
Apr 16 17:13:32.907642 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:32.907531 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0a3939f0-ce74-4722-9272-78d567def0c5" containerName="main"
Apr 16 17:13:32.907642 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:32.907545 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a3939f0-ce74-4722-9272-78d567def0c5" containerName="main"
Apr 16 17:13:32.907642 ip-10-0-138-58 kubenswrapper[2573]:
I0416 17:13:32.907553 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0a3939f0-ce74-4722-9272-78d567def0c5" containerName="tokenizer" Apr 16 17:13:32.907642 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:32.907559 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a3939f0-ce74-4722-9272-78d567def0c5" containerName="tokenizer" Apr 16 17:13:32.907642 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:32.907571 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0a3939f0-ce74-4722-9272-78d567def0c5" containerName="storage-initializer" Apr 16 17:13:32.907642 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:32.907578 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a3939f0-ce74-4722-9272-78d567def0c5" containerName="storage-initializer" Apr 16 17:13:32.907867 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:32.907652 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="0a3939f0-ce74-4722-9272-78d567def0c5" containerName="main" Apr 16 17:13:32.907867 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:32.907662 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="0a3939f0-ce74-4722-9272-78d567def0c5" containerName="tokenizer" Apr 16 17:13:32.912660 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:32.912638 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7f8ccb594c-zf7sz" Apr 16 17:13:32.915629 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:32.915599 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-kserve-dockercfg-xwsbr\"" Apr 16 17:13:32.915629 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:32.915601 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-kserve-self-signed-certs\"" Apr 16 17:13:32.919664 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:32.919640 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7f8ccb594c-zf7sz"] Apr 16 17:13:33.007595 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:33.007569 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e5bdb323-afb3-490d-a32e-0f2b04579c86-home\") pod \"router-with-refs-pd-test-kserve-7f8ccb594c-zf7sz\" (UID: \"e5bdb323-afb3-490d-a32e-0f2b04579c86\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7f8ccb594c-zf7sz" Apr 16 17:13:33.007708 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:33.007602 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e5bdb323-afb3-490d-a32e-0f2b04579c86-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-7f8ccb594c-zf7sz\" (UID: \"e5bdb323-afb3-490d-a32e-0f2b04579c86\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7f8ccb594c-zf7sz" Apr 16 17:13:33.007708 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:33.007694 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/e5bdb323-afb3-490d-a32e-0f2b04579c86-model-cache\") pod \"router-with-refs-pd-test-kserve-7f8ccb594c-zf7sz\" (UID: \"e5bdb323-afb3-490d-a32e-0f2b04579c86\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7f8ccb594c-zf7sz" Apr 16 17:13:33.007795 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:33.007732 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfzcd\" (UniqueName: \"kubernetes.io/projected/e5bdb323-afb3-490d-a32e-0f2b04579c86-kube-api-access-pfzcd\") pod \"router-with-refs-pd-test-kserve-7f8ccb594c-zf7sz\" (UID: \"e5bdb323-afb3-490d-a32e-0f2b04579c86\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7f8ccb594c-zf7sz" Apr 16 17:13:33.007795 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:33.007782 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e5bdb323-afb3-490d-a32e-0f2b04579c86-tls-certs\") pod \"router-with-refs-pd-test-kserve-7f8ccb594c-zf7sz\" (UID: \"e5bdb323-afb3-490d-a32e-0f2b04579c86\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7f8ccb594c-zf7sz" Apr 16 17:13:33.007862 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:33.007841 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e5bdb323-afb3-490d-a32e-0f2b04579c86-dshm\") pod \"router-with-refs-pd-test-kserve-7f8ccb594c-zf7sz\" (UID: \"e5bdb323-afb3-490d-a32e-0f2b04579c86\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7f8ccb594c-zf7sz" Apr 16 17:13:33.108887 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:33.108859 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e5bdb323-afb3-490d-a32e-0f2b04579c86-tls-certs\") pod \"router-with-refs-pd-test-kserve-7f8ccb594c-zf7sz\" (UID: 
\"e5bdb323-afb3-490d-a32e-0f2b04579c86\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7f8ccb594c-zf7sz" Apr 16 17:13:33.109006 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:33.108905 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e5bdb323-afb3-490d-a32e-0f2b04579c86-dshm\") pod \"router-with-refs-pd-test-kserve-7f8ccb594c-zf7sz\" (UID: \"e5bdb323-afb3-490d-a32e-0f2b04579c86\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7f8ccb594c-zf7sz" Apr 16 17:13:33.109006 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:33.108932 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e5bdb323-afb3-490d-a32e-0f2b04579c86-home\") pod \"router-with-refs-pd-test-kserve-7f8ccb594c-zf7sz\" (UID: \"e5bdb323-afb3-490d-a32e-0f2b04579c86\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7f8ccb594c-zf7sz" Apr 16 17:13:33.109006 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:33.108958 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e5bdb323-afb3-490d-a32e-0f2b04579c86-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-7f8ccb594c-zf7sz\" (UID: \"e5bdb323-afb3-490d-a32e-0f2b04579c86\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7f8ccb594c-zf7sz" Apr 16 17:13:33.109152 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:33.109018 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e5bdb323-afb3-490d-a32e-0f2b04579c86-model-cache\") pod \"router-with-refs-pd-test-kserve-7f8ccb594c-zf7sz\" (UID: \"e5bdb323-afb3-490d-a32e-0f2b04579c86\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7f8ccb594c-zf7sz" Apr 16 17:13:33.109152 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:33.109044 
2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pfzcd\" (UniqueName: \"kubernetes.io/projected/e5bdb323-afb3-490d-a32e-0f2b04579c86-kube-api-access-pfzcd\") pod \"router-with-refs-pd-test-kserve-7f8ccb594c-zf7sz\" (UID: \"e5bdb323-afb3-490d-a32e-0f2b04579c86\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7f8ccb594c-zf7sz" Apr 16 17:13:33.109265 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:33.109244 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e5bdb323-afb3-490d-a32e-0f2b04579c86-home\") pod \"router-with-refs-pd-test-kserve-7f8ccb594c-zf7sz\" (UID: \"e5bdb323-afb3-490d-a32e-0f2b04579c86\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7f8ccb594c-zf7sz" Apr 16 17:13:33.109332 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:33.109304 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e5bdb323-afb3-490d-a32e-0f2b04579c86-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-7f8ccb594c-zf7sz\" (UID: \"e5bdb323-afb3-490d-a32e-0f2b04579c86\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7f8ccb594c-zf7sz" Apr 16 17:13:33.109481 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:33.109461 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e5bdb323-afb3-490d-a32e-0f2b04579c86-model-cache\") pod \"router-with-refs-pd-test-kserve-7f8ccb594c-zf7sz\" (UID: \"e5bdb323-afb3-490d-a32e-0f2b04579c86\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7f8ccb594c-zf7sz" Apr 16 17:13:33.111200 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:33.111182 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e5bdb323-afb3-490d-a32e-0f2b04579c86-dshm\") pod 
\"router-with-refs-pd-test-kserve-7f8ccb594c-zf7sz\" (UID: \"e5bdb323-afb3-490d-a32e-0f2b04579c86\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7f8ccb594c-zf7sz" Apr 16 17:13:33.111475 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:33.111459 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e5bdb323-afb3-490d-a32e-0f2b04579c86-tls-certs\") pod \"router-with-refs-pd-test-kserve-7f8ccb594c-zf7sz\" (UID: \"e5bdb323-afb3-490d-a32e-0f2b04579c86\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7f8ccb594c-zf7sz" Apr 16 17:13:33.117102 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:33.117079 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfzcd\" (UniqueName: \"kubernetes.io/projected/e5bdb323-afb3-490d-a32e-0f2b04579c86-kube-api-access-pfzcd\") pod \"router-with-refs-pd-test-kserve-7f8ccb594c-zf7sz\" (UID: \"e5bdb323-afb3-490d-a32e-0f2b04579c86\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7f8ccb594c-zf7sz" Apr 16 17:13:33.225341 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:33.225286 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7f8ccb594c-zf7sz" Apr 16 17:13:33.350811 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:33.350786 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7f8ccb594c-zf7sz"] Apr 16 17:13:33.352777 ip-10-0-138-58 kubenswrapper[2573]: W0416 17:13:33.352748 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5bdb323_afb3_490d_a32e_0f2b04579c86.slice/crio-fcf93a3ef39446f8053c5b0ff4da7b20595698cb89fc5363aa10284fcab6f1e9 WatchSource:0}: Error finding container fcf93a3ef39446f8053c5b0ff4da7b20595698cb89fc5363aa10284fcab6f1e9: Status 404 returned error can't find the container with id fcf93a3ef39446f8053c5b0ff4da7b20595698cb89fc5363aa10284fcab6f1e9 Apr 16 17:13:33.642217 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:33.642187 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7f8ccb594c-zf7sz" event={"ID":"e5bdb323-afb3-490d-a32e-0f2b04579c86","Type":"ContainerStarted","Data":"fcf93a3ef39446f8053c5b0ff4da7b20595698cb89fc5363aa10284fcab6f1e9"} Apr 16 17:13:33.645652 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:33.645628 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-8546c8d7447ksk"] Apr 16 17:13:33.649528 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:33.649510 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-8546c8d7447ksk" Apr 16 17:13:33.652099 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:33.652080 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-epp-sa-dockercfg-g7xp4\"" Apr 16 17:13:33.659328 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:33.659298 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-8546c8d7447ksk"] Apr 16 17:13:33.816954 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:33.816921 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/42d0a094-07eb-454d-a708-b1897293a5a4-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-router-scheduler-8546c8d7447ksk\" (UID: \"42d0a094-07eb-454d-a708-b1897293a5a4\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-8546c8d7447ksk" Apr 16 17:13:33.817123 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:33.816975 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/42d0a094-07eb-454d-a708-b1897293a5a4-tokenizer-tmp\") pod \"router-with-refs-pd-test-kserve-router-scheduler-8546c8d7447ksk\" (UID: \"42d0a094-07eb-454d-a708-b1897293a5a4\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-8546c8d7447ksk" Apr 16 17:13:33.817123 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:33.817053 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/42d0a094-07eb-454d-a708-b1897293a5a4-tls-certs\") pod \"router-with-refs-pd-test-kserve-router-scheduler-8546c8d7447ksk\" (UID: 
\"42d0a094-07eb-454d-a708-b1897293a5a4\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-8546c8d7447ksk" Apr 16 17:13:33.817123 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:33.817097 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwn87\" (UniqueName: \"kubernetes.io/projected/42d0a094-07eb-454d-a708-b1897293a5a4-kube-api-access-vwn87\") pod \"router-with-refs-pd-test-kserve-router-scheduler-8546c8d7447ksk\" (UID: \"42d0a094-07eb-454d-a708-b1897293a5a4\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-8546c8d7447ksk" Apr 16 17:13:33.817322 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:33.817128 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/42d0a094-07eb-454d-a708-b1897293a5a4-tokenizer-cache\") pod \"router-with-refs-pd-test-kserve-router-scheduler-8546c8d7447ksk\" (UID: \"42d0a094-07eb-454d-a708-b1897293a5a4\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-8546c8d7447ksk" Apr 16 17:13:33.817322 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:33.817242 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/42d0a094-07eb-454d-a708-b1897293a5a4-tokenizer-uds\") pod \"router-with-refs-pd-test-kserve-router-scheduler-8546c8d7447ksk\" (UID: \"42d0a094-07eb-454d-a708-b1897293a5a4\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-8546c8d7447ksk" Apr 16 17:13:33.918592 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:33.918523 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/42d0a094-07eb-454d-a708-b1897293a5a4-tls-certs\") pod \"router-with-refs-pd-test-kserve-router-scheduler-8546c8d7447ksk\" 
(UID: \"42d0a094-07eb-454d-a708-b1897293a5a4\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-8546c8d7447ksk" Apr 16 17:13:33.918592 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:33.918575 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vwn87\" (UniqueName: \"kubernetes.io/projected/42d0a094-07eb-454d-a708-b1897293a5a4-kube-api-access-vwn87\") pod \"router-with-refs-pd-test-kserve-router-scheduler-8546c8d7447ksk\" (UID: \"42d0a094-07eb-454d-a708-b1897293a5a4\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-8546c8d7447ksk" Apr 16 17:13:33.919049 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:33.918615 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/42d0a094-07eb-454d-a708-b1897293a5a4-tokenizer-cache\") pod \"router-with-refs-pd-test-kserve-router-scheduler-8546c8d7447ksk\" (UID: \"42d0a094-07eb-454d-a708-b1897293a5a4\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-8546c8d7447ksk" Apr 16 17:13:33.919049 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:33.918695 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/42d0a094-07eb-454d-a708-b1897293a5a4-tokenizer-uds\") pod \"router-with-refs-pd-test-kserve-router-scheduler-8546c8d7447ksk\" (UID: \"42d0a094-07eb-454d-a708-b1897293a5a4\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-8546c8d7447ksk" Apr 16 17:13:33.919049 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:33.918738 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/42d0a094-07eb-454d-a708-b1897293a5a4-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-router-scheduler-8546c8d7447ksk\" (UID: 
\"42d0a094-07eb-454d-a708-b1897293a5a4\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-8546c8d7447ksk" Apr 16 17:13:33.919049 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:33.918779 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/42d0a094-07eb-454d-a708-b1897293a5a4-tokenizer-tmp\") pod \"router-with-refs-pd-test-kserve-router-scheduler-8546c8d7447ksk\" (UID: \"42d0a094-07eb-454d-a708-b1897293a5a4\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-8546c8d7447ksk" Apr 16 17:13:33.919245 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:33.919186 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/42d0a094-07eb-454d-a708-b1897293a5a4-tokenizer-cache\") pod \"router-with-refs-pd-test-kserve-router-scheduler-8546c8d7447ksk\" (UID: \"42d0a094-07eb-454d-a708-b1897293a5a4\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-8546c8d7447ksk" Apr 16 17:13:33.919245 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:33.919228 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/42d0a094-07eb-454d-a708-b1897293a5a4-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-router-scheduler-8546c8d7447ksk\" (UID: \"42d0a094-07eb-454d-a708-b1897293a5a4\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-8546c8d7447ksk" Apr 16 17:13:33.919328 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:33.919276 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/42d0a094-07eb-454d-a708-b1897293a5a4-tokenizer-uds\") pod \"router-with-refs-pd-test-kserve-router-scheduler-8546c8d7447ksk\" (UID: \"42d0a094-07eb-454d-a708-b1897293a5a4\") " 
pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-8546c8d7447ksk" Apr 16 17:13:33.919566 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:33.919547 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/42d0a094-07eb-454d-a708-b1897293a5a4-tokenizer-tmp\") pod \"router-with-refs-pd-test-kserve-router-scheduler-8546c8d7447ksk\" (UID: \"42d0a094-07eb-454d-a708-b1897293a5a4\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-8546c8d7447ksk" Apr 16 17:13:33.921391 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:33.921355 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/42d0a094-07eb-454d-a708-b1897293a5a4-tls-certs\") pod \"router-with-refs-pd-test-kserve-router-scheduler-8546c8d7447ksk\" (UID: \"42d0a094-07eb-454d-a708-b1897293a5a4\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-8546c8d7447ksk" Apr 16 17:13:33.926308 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:33.926285 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwn87\" (UniqueName: \"kubernetes.io/projected/42d0a094-07eb-454d-a708-b1897293a5a4-kube-api-access-vwn87\") pod \"router-with-refs-pd-test-kserve-router-scheduler-8546c8d7447ksk\" (UID: \"42d0a094-07eb-454d-a708-b1897293a5a4\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-8546c8d7447ksk" Apr 16 17:13:33.961239 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:33.961212 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-8546c8d7447ksk" Apr 16 17:13:34.196036 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:34.196013 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-8546c8d7447ksk"] Apr 16 17:13:34.197915 ip-10-0-138-58 kubenswrapper[2573]: W0416 17:13:34.197885 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42d0a094_07eb_454d_a708_b1897293a5a4.slice/crio-7d5f42cb7f3f193a98f8820c8971473880f1fbbbfc4120d574ea2586833471f3 WatchSource:0}: Error finding container 7d5f42cb7f3f193a98f8820c8971473880f1fbbbfc4120d574ea2586833471f3: Status 404 returned error can't find the container with id 7d5f42cb7f3f193a98f8820c8971473880f1fbbbfc4120d574ea2586833471f3 Apr 16 17:13:34.646393 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:34.646358 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-8546c8d7447ksk" event={"ID":"42d0a094-07eb-454d-a708-b1897293a5a4","Type":"ContainerStarted","Data":"c8688e676e93a0c9fc9aa90c18c1ab114609b8425afe69ea8861dbcfdc176364"} Apr 16 17:13:34.646575 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:34.646414 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-8546c8d7447ksk" event={"ID":"42d0a094-07eb-454d-a708-b1897293a5a4","Type":"ContainerStarted","Data":"7d5f42cb7f3f193a98f8820c8971473880f1fbbbfc4120d574ea2586833471f3"} Apr 16 17:13:34.647733 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:34.647704 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7f8ccb594c-zf7sz" 
event={"ID":"e5bdb323-afb3-490d-a32e-0f2b04579c86","Type":"ContainerStarted","Data":"ae173c29f450cfa0dd2e96b79b5b75be34aa612f9f707494ef178726cd3c8e51"} Apr 16 17:13:34.647865 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:34.647806 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7f8ccb594c-zf7sz" Apr 16 17:13:35.651855 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:35.651821 2573 generic.go:358] "Generic (PLEG): container finished" podID="42d0a094-07eb-454d-a708-b1897293a5a4" containerID="c8688e676e93a0c9fc9aa90c18c1ab114609b8425afe69ea8861dbcfdc176364" exitCode=0 Apr 16 17:13:35.652312 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:35.651906 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-8546c8d7447ksk" event={"ID":"42d0a094-07eb-454d-a708-b1897293a5a4","Type":"ContainerDied","Data":"c8688e676e93a0c9fc9aa90c18c1ab114609b8425afe69ea8861dbcfdc176364"} Apr 16 17:13:35.653726 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:35.653703 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7f8ccb594c-zf7sz" event={"ID":"e5bdb323-afb3-490d-a32e-0f2b04579c86","Type":"ContainerStarted","Data":"4f022acdbd97b587cf002faeb248262fbcf4937703004c88eacc53c00bd2bce5"} Apr 16 17:13:36.659414 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:36.659354 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-8546c8d7447ksk" event={"ID":"42d0a094-07eb-454d-a708-b1897293a5a4","Type":"ContainerStarted","Data":"2591a72ffc3116d8c11ce36a22f64a8d8a1f9db0c5e0973985e140bc43b9ee07"} Apr 16 17:13:36.659897 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:36.659420 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-8546c8d7447ksk" 
event={"ID":"42d0a094-07eb-454d-a708-b1897293a5a4","Type":"ContainerStarted","Data":"ec4fcbdb33c29aa1eefa09f4ae909644c11bb4bad619d0b1731fdd1afe5232df"} Apr 16 17:13:36.659897 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:36.659473 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-8546c8d7447ksk" Apr 16 17:13:36.681239 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:36.681192 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-8546c8d7447ksk" podStartSLOduration=3.681176706 podStartE2EDuration="3.681176706s" podCreationTimestamp="2026-04-16 17:13:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 17:13:36.679654122 +0000 UTC m=+1541.820372866" watchObservedRunningTime="2026-04-16 17:13:36.681176706 +0000 UTC m=+1541.821895447" Apr 16 17:13:39.672118 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:39.672076 2573 generic.go:358] "Generic (PLEG): container finished" podID="e5bdb323-afb3-490d-a32e-0f2b04579c86" containerID="4f022acdbd97b587cf002faeb248262fbcf4937703004c88eacc53c00bd2bce5" exitCode=0 Apr 16 17:13:39.672604 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:39.672149 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7f8ccb594c-zf7sz" event={"ID":"e5bdb323-afb3-490d-a32e-0f2b04579c86","Type":"ContainerDied","Data":"4f022acdbd97b587cf002faeb248262fbcf4937703004c88eacc53c00bd2bce5"} Apr 16 17:13:40.677065 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:40.677029 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7f8ccb594c-zf7sz" 
event={"ID":"e5bdb323-afb3-490d-a32e-0f2b04579c86","Type":"ContainerStarted","Data":"c3cc97fe575db8194d1ce1fe2a6255271c503b70efd1bb974febdedf094a687f"}
Apr 16 17:13:40.700518 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:40.700461 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7f8ccb594c-zf7sz" podStartSLOduration=7.933369645 podStartE2EDuration="8.700443278s" podCreationTimestamp="2026-04-16 17:13:32 +0000 UTC" firstStartedPulling="2026-04-16 17:13:33.354646315 +0000 UTC m=+1538.495365035" lastFinishedPulling="2026-04-16 17:13:34.121719934 +0000 UTC m=+1539.262438668" observedRunningTime="2026-04-16 17:13:40.697164311 +0000 UTC m=+1545.837883054" watchObservedRunningTime="2026-04-16 17:13:40.700443278 +0000 UTC m=+1545.841162020"
Apr 16 17:13:43.226055 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:43.226011 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7f8ccb594c-zf7sz"
Apr 16 17:13:43.226055 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:43.226061 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7f8ccb594c-zf7sz"
Apr 16 17:13:43.227547 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:43.227512 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7f8ccb594c-zf7sz" podUID="e5bdb323-afb3-490d-a32e-0f2b04579c86" containerName="main" probeResult="failure" output="Get \"https://10.132.0.42:8001/health\": dial tcp 10.132.0.42:8001: connect: connection refused"
Apr 16 17:13:43.962013 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:43.961978 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-8546c8d7447ksk"
Apr 16 17:13:43.962183 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:43.962026 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-8546c8d7447ksk"
Apr 16 17:13:43.964971 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:43.964945 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-8546c8d7447ksk"
Apr 16 17:13:44.692910 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:44.692878 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-8546c8d7447ksk"
Apr 16 17:13:46.007129 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:46.007094 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebsshf"]
Apr 16 17:13:46.007530 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:46.007421 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebsshf" podUID="5535f7d5-c91c-45d4-bd9d-4a6e3fe3a8f2" containerName="main" containerID="cri-o://ce145a9bd1df3860ebc67c89227cc75a4059a9a9a044dd2a7994df5518ad234c" gracePeriod=30
Apr 16 17:13:46.007630 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:46.007607 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebsshf" podUID="5535f7d5-c91c-45d4-bd9d-4a6e3fe3a8f2" containerName="tokenizer" containerID="cri-o://42ea3fec58036c798913867accf6a64ae10725c5062a68c200ecd9df98570c45" gracePeriod=30
Apr 16 17:13:46.074802 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:46.074701 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebsshf" podUID="5535f7d5-c91c-45d4-bd9d-4a6e3fe3a8f2" containerName="tokenizer" probeResult="failure" output="Get \"http://10.132.0.41:8082/healthz\": dial tcp 10.132.0.41:8082: connect: connection refused"
Apr 16 17:13:46.699723 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:46.699675 2573 generic.go:358] "Generic (PLEG): container finished" podID="5535f7d5-c91c-45d4-bd9d-4a6e3fe3a8f2" containerID="ce145a9bd1df3860ebc67c89227cc75a4059a9a9a044dd2a7994df5518ad234c" exitCode=0
Apr 16 17:13:46.699887 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:46.699724 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebsshf" event={"ID":"5535f7d5-c91c-45d4-bd9d-4a6e3fe3a8f2","Type":"ContainerDied","Data":"ce145a9bd1df3860ebc67c89227cc75a4059a9a9a044dd2a7994df5518ad234c"}
Apr 16 17:13:47.077340 ip-10-0-138-58 kubenswrapper[2573]: W0416 17:13:47.077314 2573 logging.go:55] [core] [Channel #719 SubChannel #720]grpc: addrConn.createTransport failed to connect to {Addr: "10.132.0.41:9003", ServerName: "10.132.0.41:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.132.0.41:9003: connect: connection refused"
Apr 16 17:13:47.177814 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:47.177793 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebsshf"
Apr 16 17:13:47.239701 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:47.239630 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5535f7d5-c91c-45d4-bd9d-4a6e3fe3a8f2-kserve-provision-location\") pod \"5535f7d5-c91c-45d4-bd9d-4a6e3fe3a8f2\" (UID: \"5535f7d5-c91c-45d4-bd9d-4a6e3fe3a8f2\") "
Apr 16 17:13:47.239701 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:47.239665 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5535f7d5-c91c-45d4-bd9d-4a6e3fe3a8f2-tls-certs\") pod \"5535f7d5-c91c-45d4-bd9d-4a6e3fe3a8f2\" (UID: \"5535f7d5-c91c-45d4-bd9d-4a6e3fe3a8f2\") "
Apr 16 17:13:47.239701 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:47.239687 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/5535f7d5-c91c-45d4-bd9d-4a6e3fe3a8f2-tokenizer-uds\") pod \"5535f7d5-c91c-45d4-bd9d-4a6e3fe3a8f2\" (UID: \"5535f7d5-c91c-45d4-bd9d-4a6e3fe3a8f2\") "
Apr 16 17:13:47.239958 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:47.239730 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/5535f7d5-c91c-45d4-bd9d-4a6e3fe3a8f2-tokenizer-cache\") pod \"5535f7d5-c91c-45d4-bd9d-4a6e3fe3a8f2\" (UID: \"5535f7d5-c91c-45d4-bd9d-4a6e3fe3a8f2\") "
Apr 16 17:13:47.239958 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:47.239769 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/5535f7d5-c91c-45d4-bd9d-4a6e3fe3a8f2-tokenizer-tmp\") pod \"5535f7d5-c91c-45d4-bd9d-4a6e3fe3a8f2\" (UID: \"5535f7d5-c91c-45d4-bd9d-4a6e3fe3a8f2\") "
Apr 16 17:13:47.239958 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:47.239786 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rv244\" (UniqueName: \"kubernetes.io/projected/5535f7d5-c91c-45d4-bd9d-4a6e3fe3a8f2-kube-api-access-rv244\") pod \"5535f7d5-c91c-45d4-bd9d-4a6e3fe3a8f2\" (UID: \"5535f7d5-c91c-45d4-bd9d-4a6e3fe3a8f2\") "
Apr 16 17:13:47.240100 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:47.239958 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5535f7d5-c91c-45d4-bd9d-4a6e3fe3a8f2-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "5535f7d5-c91c-45d4-bd9d-4a6e3fe3a8f2" (UID: "5535f7d5-c91c-45d4-bd9d-4a6e3fe3a8f2"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 17:13:47.240100 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:47.240078 2573 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/5535f7d5-c91c-45d4-bd9d-4a6e3fe3a8f2-tokenizer-uds\") on node \"ip-10-0-138-58.ec2.internal\" DevicePath \"\""
Apr 16 17:13:47.240100 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:47.240080 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5535f7d5-c91c-45d4-bd9d-4a6e3fe3a8f2-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "5535f7d5-c91c-45d4-bd9d-4a6e3fe3a8f2" (UID: "5535f7d5-c91c-45d4-bd9d-4a6e3fe3a8f2"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 17:13:47.240271 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:47.240250 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5535f7d5-c91c-45d4-bd9d-4a6e3fe3a8f2-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "5535f7d5-c91c-45d4-bd9d-4a6e3fe3a8f2" (UID: "5535f7d5-c91c-45d4-bd9d-4a6e3fe3a8f2"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 17:13:47.240749 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:47.240717 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5535f7d5-c91c-45d4-bd9d-4a6e3fe3a8f2-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "5535f7d5-c91c-45d4-bd9d-4a6e3fe3a8f2" (UID: "5535f7d5-c91c-45d4-bd9d-4a6e3fe3a8f2"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 17:13:47.241875 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:47.241845 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5535f7d5-c91c-45d4-bd9d-4a6e3fe3a8f2-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "5535f7d5-c91c-45d4-bd9d-4a6e3fe3a8f2" (UID: "5535f7d5-c91c-45d4-bd9d-4a6e3fe3a8f2"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 17:13:47.242043 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:47.242022 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5535f7d5-c91c-45d4-bd9d-4a6e3fe3a8f2-kube-api-access-rv244" (OuterVolumeSpecName: "kube-api-access-rv244") pod "5535f7d5-c91c-45d4-bd9d-4a6e3fe3a8f2" (UID: "5535f7d5-c91c-45d4-bd9d-4a6e3fe3a8f2"). InnerVolumeSpecName "kube-api-access-rv244". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 17:13:47.341123 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:47.341100 2573 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/5535f7d5-c91c-45d4-bd9d-4a6e3fe3a8f2-tokenizer-cache\") on node \"ip-10-0-138-58.ec2.internal\" DevicePath \"\""
Apr 16 17:13:47.341123 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:47.341123 2573 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/5535f7d5-c91c-45d4-bd9d-4a6e3fe3a8f2-tokenizer-tmp\") on node \"ip-10-0-138-58.ec2.internal\" DevicePath \"\""
Apr 16 17:13:47.341257 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:47.341132 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rv244\" (UniqueName: \"kubernetes.io/projected/5535f7d5-c91c-45d4-bd9d-4a6e3fe3a8f2-kube-api-access-rv244\") on node \"ip-10-0-138-58.ec2.internal\" DevicePath \"\""
Apr 16 17:13:47.341257 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:47.341145 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5535f7d5-c91c-45d4-bd9d-4a6e3fe3a8f2-kserve-provision-location\") on node \"ip-10-0-138-58.ec2.internal\" DevicePath \"\""
Apr 16 17:13:47.341257 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:47.341153 2573 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5535f7d5-c91c-45d4-bd9d-4a6e3fe3a8f2-tls-certs\") on node \"ip-10-0-138-58.ec2.internal\" DevicePath \"\""
Apr 16 17:13:47.375180 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:47.375130 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"]
Apr 16 17:13:47.375435 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:47.375413 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" podUID="9974a58f-765a-41bc-bebc-aafb5c7e387a" containerName="main" containerID="cri-o://8ea973c7cc06bd2b10794cf342e124fb115926cde55586c7ec2393c39c48c9f7" gracePeriod=30
Apr 16 17:13:47.704361 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:47.704323 2573 generic.go:358] "Generic (PLEG): container finished" podID="5535f7d5-c91c-45d4-bd9d-4a6e3fe3a8f2" containerID="42ea3fec58036c798913867accf6a64ae10725c5062a68c200ecd9df98570c45" exitCode=0
Apr 16 17:13:47.704563 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:47.704405 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebsshf" event={"ID":"5535f7d5-c91c-45d4-bd9d-4a6e3fe3a8f2","Type":"ContainerDied","Data":"42ea3fec58036c798913867accf6a64ae10725c5062a68c200ecd9df98570c45"}
Apr 16 17:13:47.704563 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:47.704447 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebsshf"
Apr 16 17:13:47.704563 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:47.704459 2573 scope.go:117] "RemoveContainer" containerID="42ea3fec58036c798913867accf6a64ae10725c5062a68c200ecd9df98570c45"
Apr 16 17:13:47.704714 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:47.704448 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebsshf" event={"ID":"5535f7d5-c91c-45d4-bd9d-4a6e3fe3a8f2","Type":"ContainerDied","Data":"8fcfb31f10c45336aa7a659c4d29bb88b9b6a1c4d3ba79208d865559d74a1b66"}
Apr 16 17:13:47.713078 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:47.713049 2573 scope.go:117] "RemoveContainer" containerID="ce145a9bd1df3860ebc67c89227cc75a4059a9a9a044dd2a7994df5518ad234c"
Apr 16 17:13:47.720151 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:47.720120 2573 scope.go:117] "RemoveContainer" containerID="8db1daea2e7811f4ddc64ce066a7ff93cc698d45b3a9382a890321c611999f3c"
Apr 16 17:13:47.723198 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:47.723175 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebsshf"]
Apr 16 17:13:47.728519 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:47.728490 2573 scope.go:117] "RemoveContainer" containerID="42ea3fec58036c798913867accf6a64ae10725c5062a68c200ecd9df98570c45"
Apr 16 17:13:47.728665 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:47.728647 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebsshf"]
Apr 16 17:13:47.728820 ip-10-0-138-58 kubenswrapper[2573]: E0416 17:13:47.728796 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42ea3fec58036c798913867accf6a64ae10725c5062a68c200ecd9df98570c45\": container with ID starting with 42ea3fec58036c798913867accf6a64ae10725c5062a68c200ecd9df98570c45 not found: ID does not exist" containerID="42ea3fec58036c798913867accf6a64ae10725c5062a68c200ecd9df98570c45"
Apr 16 17:13:47.728912 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:47.728826 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42ea3fec58036c798913867accf6a64ae10725c5062a68c200ecd9df98570c45"} err="failed to get container status \"42ea3fec58036c798913867accf6a64ae10725c5062a68c200ecd9df98570c45\": rpc error: code = NotFound desc = could not find container \"42ea3fec58036c798913867accf6a64ae10725c5062a68c200ecd9df98570c45\": container with ID starting with 42ea3fec58036c798913867accf6a64ae10725c5062a68c200ecd9df98570c45 not found: ID does not exist"
Apr 16 17:13:47.728912 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:47.728843 2573 scope.go:117] "RemoveContainer" containerID="ce145a9bd1df3860ebc67c89227cc75a4059a9a9a044dd2a7994df5518ad234c"
Apr 16 17:13:47.729098 ip-10-0-138-58 kubenswrapper[2573]: E0416 17:13:47.729081 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce145a9bd1df3860ebc67c89227cc75a4059a9a9a044dd2a7994df5518ad234c\": container with ID starting with ce145a9bd1df3860ebc67c89227cc75a4059a9a9a044dd2a7994df5518ad234c not found: ID does not exist" containerID="ce145a9bd1df3860ebc67c89227cc75a4059a9a9a044dd2a7994df5518ad234c"
Apr 16 17:13:47.729161 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:47.729104 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce145a9bd1df3860ebc67c89227cc75a4059a9a9a044dd2a7994df5518ad234c"} err="failed to get container status \"ce145a9bd1df3860ebc67c89227cc75a4059a9a9a044dd2a7994df5518ad234c\": rpc error: code = NotFound desc = could not find container \"ce145a9bd1df3860ebc67c89227cc75a4059a9a9a044dd2a7994df5518ad234c\": container with ID starting with ce145a9bd1df3860ebc67c89227cc75a4059a9a9a044dd2a7994df5518ad234c not found: ID does not exist"
Apr 16 17:13:47.729161 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:47.729117 2573 scope.go:117] "RemoveContainer" containerID="8db1daea2e7811f4ddc64ce066a7ff93cc698d45b3a9382a890321c611999f3c"
Apr 16 17:13:47.729364 ip-10-0-138-58 kubenswrapper[2573]: E0416 17:13:47.729348 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8db1daea2e7811f4ddc64ce066a7ff93cc698d45b3a9382a890321c611999f3c\": container with ID starting with 8db1daea2e7811f4ddc64ce066a7ff93cc698d45b3a9382a890321c611999f3c not found: ID does not exist" containerID="8db1daea2e7811f4ddc64ce066a7ff93cc698d45b3a9382a890321c611999f3c"
Apr 16 17:13:47.729433 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:47.729366 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8db1daea2e7811f4ddc64ce066a7ff93cc698d45b3a9382a890321c611999f3c"} err="failed to get container status \"8db1daea2e7811f4ddc64ce066a7ff93cc698d45b3a9382a890321c611999f3c\": rpc error: code = NotFound desc = could not find container \"8db1daea2e7811f4ddc64ce066a7ff93cc698d45b3a9382a890321c611999f3c\": container with ID starting with 8db1daea2e7811f4ddc64ce066a7ff93cc698d45b3a9382a890321c611999f3c not found: ID does not exist"
Apr 16 17:13:48.077660 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:48.077625 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schebsshf" podUID="5535f7d5-c91c-45d4-bd9d-4a6e3fe3a8f2" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.132.0.41:9003\" within 1s: context deadline exceeded"
Apr 16 17:13:48.681165 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:48.681145 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 16 17:13:48.710417 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:48.710368 2573 generic.go:358] "Generic (PLEG): container finished" podID="9974a58f-765a-41bc-bebc-aafb5c7e387a" containerID="8ea973c7cc06bd2b10794cf342e124fb115926cde55586c7ec2393c39c48c9f7" exitCode=0
Apr 16 17:13:48.710537 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:48.710423 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"9974a58f-765a-41bc-bebc-aafb5c7e387a","Type":"ContainerDied","Data":"8ea973c7cc06bd2b10794cf342e124fb115926cde55586c7ec2393c39c48c9f7"}
Apr 16 17:13:48.710537 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:48.710462 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"9974a58f-765a-41bc-bebc-aafb5c7e387a","Type":"ContainerDied","Data":"19753a648364d2538758b361f601cb0bad1a4d29f8f0c2582ef1e52ef8204885"}
Apr 16 17:13:48.710537 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:48.710480 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 16 17:13:48.710668 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:48.710487 2573 scope.go:117] "RemoveContainer" containerID="8ea973c7cc06bd2b10794cf342e124fb115926cde55586c7ec2393c39c48c9f7"
Apr 16 17:13:48.732871 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:48.732840 2573 scope.go:117] "RemoveContainer" containerID="47e7875ece99474fdabee1eb6d5d619c5a7b2dd8b09b28e28f37ca5e705d821d"
Apr 16 17:13:48.752164 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:48.752143 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9974a58f-765a-41bc-bebc-aafb5c7e387a-kserve-provision-location\") pod \"9974a58f-765a-41bc-bebc-aafb5c7e387a\" (UID: \"9974a58f-765a-41bc-bebc-aafb5c7e387a\") "
Apr 16 17:13:48.752281 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:48.752182 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9974a58f-765a-41bc-bebc-aafb5c7e387a-home\") pod \"9974a58f-765a-41bc-bebc-aafb5c7e387a\" (UID: \"9974a58f-765a-41bc-bebc-aafb5c7e387a\") "
Apr 16 17:13:48.752281 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:48.752249 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9974a58f-765a-41bc-bebc-aafb5c7e387a-model-cache\") pod \"9974a58f-765a-41bc-bebc-aafb5c7e387a\" (UID: \"9974a58f-765a-41bc-bebc-aafb5c7e387a\") "
Apr 16 17:13:48.752427 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:48.752290 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9974a58f-765a-41bc-bebc-aafb5c7e387a-tls-certs\") pod \"9974a58f-765a-41bc-bebc-aafb5c7e387a\" (UID: \"9974a58f-765a-41bc-bebc-aafb5c7e387a\") "
Apr 16 17:13:48.752427 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:48.752322 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mgp9p\" (UniqueName: \"kubernetes.io/projected/9974a58f-765a-41bc-bebc-aafb5c7e387a-kube-api-access-mgp9p\") pod \"9974a58f-765a-41bc-bebc-aafb5c7e387a\" (UID: \"9974a58f-765a-41bc-bebc-aafb5c7e387a\") "
Apr 16 17:13:48.752427 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:48.752355 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9974a58f-765a-41bc-bebc-aafb5c7e387a-dshm\") pod \"9974a58f-765a-41bc-bebc-aafb5c7e387a\" (UID: \"9974a58f-765a-41bc-bebc-aafb5c7e387a\") "
Apr 16 17:13:48.752709 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:48.752686 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9974a58f-765a-41bc-bebc-aafb5c7e387a-home" (OuterVolumeSpecName: "home") pod "9974a58f-765a-41bc-bebc-aafb5c7e387a" (UID: "9974a58f-765a-41bc-bebc-aafb5c7e387a"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 17:13:48.752939 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:48.752916 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9974a58f-765a-41bc-bebc-aafb5c7e387a-model-cache" (OuterVolumeSpecName: "model-cache") pod "9974a58f-765a-41bc-bebc-aafb5c7e387a" (UID: "9974a58f-765a-41bc-bebc-aafb5c7e387a"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 17:13:48.754628 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:48.754584 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9974a58f-765a-41bc-bebc-aafb5c7e387a-dshm" (OuterVolumeSpecName: "dshm") pod "9974a58f-765a-41bc-bebc-aafb5c7e387a" (UID: "9974a58f-765a-41bc-bebc-aafb5c7e387a"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 17:13:48.754728 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:48.754629 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9974a58f-765a-41bc-bebc-aafb5c7e387a-kube-api-access-mgp9p" (OuterVolumeSpecName: "kube-api-access-mgp9p") pod "9974a58f-765a-41bc-bebc-aafb5c7e387a" (UID: "9974a58f-765a-41bc-bebc-aafb5c7e387a"). InnerVolumeSpecName "kube-api-access-mgp9p". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 17:13:48.754924 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:48.754907 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9974a58f-765a-41bc-bebc-aafb5c7e387a-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "9974a58f-765a-41bc-bebc-aafb5c7e387a" (UID: "9974a58f-765a-41bc-bebc-aafb5c7e387a"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 17:13:48.811478 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:48.811455 2573 scope.go:117] "RemoveContainer" containerID="8ea973c7cc06bd2b10794cf342e124fb115926cde55586c7ec2393c39c48c9f7"
Apr 16 17:13:48.811800 ip-10-0-138-58 kubenswrapper[2573]: E0416 17:13:48.811782 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ea973c7cc06bd2b10794cf342e124fb115926cde55586c7ec2393c39c48c9f7\": container with ID starting with 8ea973c7cc06bd2b10794cf342e124fb115926cde55586c7ec2393c39c48c9f7 not found: ID does not exist" containerID="8ea973c7cc06bd2b10794cf342e124fb115926cde55586c7ec2393c39c48c9f7"
Apr 16 17:13:48.811882 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:48.811811 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ea973c7cc06bd2b10794cf342e124fb115926cde55586c7ec2393c39c48c9f7"} err="failed to get container status \"8ea973c7cc06bd2b10794cf342e124fb115926cde55586c7ec2393c39c48c9f7\": rpc error: code = NotFound desc = could not find container \"8ea973c7cc06bd2b10794cf342e124fb115926cde55586c7ec2393c39c48c9f7\": container with ID starting with 8ea973c7cc06bd2b10794cf342e124fb115926cde55586c7ec2393c39c48c9f7 not found: ID does not exist"
Apr 16 17:13:48.811882 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:48.811840 2573 scope.go:117] "RemoveContainer" containerID="47e7875ece99474fdabee1eb6d5d619c5a7b2dd8b09b28e28f37ca5e705d821d"
Apr 16 17:13:48.812109 ip-10-0-138-58 kubenswrapper[2573]: E0416 17:13:48.812078 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47e7875ece99474fdabee1eb6d5d619c5a7b2dd8b09b28e28f37ca5e705d821d\": container with ID starting with 47e7875ece99474fdabee1eb6d5d619c5a7b2dd8b09b28e28f37ca5e705d821d not found: ID does not exist" containerID="47e7875ece99474fdabee1eb6d5d619c5a7b2dd8b09b28e28f37ca5e705d821d"
Apr 16 17:13:48.812176 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:48.812109 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47e7875ece99474fdabee1eb6d5d619c5a7b2dd8b09b28e28f37ca5e705d821d"} err="failed to get container status \"47e7875ece99474fdabee1eb6d5d619c5a7b2dd8b09b28e28f37ca5e705d821d\": rpc error: code = NotFound desc = could not find container \"47e7875ece99474fdabee1eb6d5d619c5a7b2dd8b09b28e28f37ca5e705d821d\": container with ID starting with 47e7875ece99474fdabee1eb6d5d619c5a7b2dd8b09b28e28f37ca5e705d821d not found: ID does not exist"
Apr 16 17:13:48.822116 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:48.822088 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9974a58f-765a-41bc-bebc-aafb5c7e387a-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "9974a58f-765a-41bc-bebc-aafb5c7e387a" (UID: "9974a58f-765a-41bc-bebc-aafb5c7e387a"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 17:13:48.853714 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:48.853689 2573 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9974a58f-765a-41bc-bebc-aafb5c7e387a-model-cache\") on node \"ip-10-0-138-58.ec2.internal\" DevicePath \"\""
Apr 16 17:13:48.853714 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:48.853714 2573 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9974a58f-765a-41bc-bebc-aafb5c7e387a-tls-certs\") on node \"ip-10-0-138-58.ec2.internal\" DevicePath \"\""
Apr 16 17:13:48.853833 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:48.853725 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mgp9p\" (UniqueName: \"kubernetes.io/projected/9974a58f-765a-41bc-bebc-aafb5c7e387a-kube-api-access-mgp9p\") on node \"ip-10-0-138-58.ec2.internal\" DevicePath \"\""
Apr 16 17:13:48.853833 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:48.853734 2573 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9974a58f-765a-41bc-bebc-aafb5c7e387a-dshm\") on node \"ip-10-0-138-58.ec2.internal\" DevicePath \"\""
Apr 16 17:13:48.853833 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:48.853743 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9974a58f-765a-41bc-bebc-aafb5c7e387a-kserve-provision-location\") on node \"ip-10-0-138-58.ec2.internal\" DevicePath \"\""
Apr 16 17:13:48.853833 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:48.853752 2573 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9974a58f-765a-41bc-bebc-aafb5c7e387a-home\") on node \"ip-10-0-138-58.ec2.internal\" DevicePath \"\""
Apr 16 17:13:49.035101 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:49.035073 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"]
Apr 16 17:13:49.038810 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:49.038789 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"]
Apr 16 17:13:49.420823 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:49.420796 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5535f7d5-c91c-45d4-bd9d-4a6e3fe3a8f2" path="/var/lib/kubelet/pods/5535f7d5-c91c-45d4-bd9d-4a6e3fe3a8f2/volumes"
Apr 16 17:13:49.421254 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:49.421241 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9974a58f-765a-41bc-bebc-aafb5c7e387a" path="/var/lib/kubelet/pods/9974a58f-765a-41bc-bebc-aafb5c7e387a/volumes"
Apr 16 17:13:53.226321 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:53.226275 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7f8ccb594c-zf7sz" podUID="e5bdb323-afb3-490d-a32e-0f2b04579c86" containerName="main" probeResult="failure" output="Get \"https://10.132.0.42:8001/health\": dial tcp 10.132.0.42:8001: connect: connection refused"
Apr 16 17:13:53.239275 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:13:53.239248 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7f8ccb594c-zf7sz"
Apr 16 17:14:03.226239 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:14:03.226198 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7f8ccb594c-zf7sz" podUID="e5bdb323-afb3-490d-a32e-0f2b04579c86" containerName="main" probeResult="failure" output="Get \"https://10.132.0.42:8001/health\": dial tcp 10.132.0.42:8001: connect: connection refused"
Apr 16 17:14:05.696005 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:14:05.695976 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-8546c8d7447ksk"
Apr 16 17:14:13.225930 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:14:13.225881 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7f8ccb594c-zf7sz" podUID="e5bdb323-afb3-490d-a32e-0f2b04579c86" containerName="main" probeResult="failure" output="Get \"https://10.132.0.42:8001/health\": dial tcp 10.132.0.42:8001: connect: connection refused"
Apr 16 17:14:23.226894 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:14:23.226803 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7f8ccb594c-zf7sz" podUID="e5bdb323-afb3-490d-a32e-0f2b04579c86" containerName="main" probeResult="failure" output="Get \"https://10.132.0.42:8001/health\": dial tcp 10.132.0.42:8001: connect: connection refused"
Apr 16 17:14:33.226099 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:14:33.226047 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7f8ccb594c-zf7sz" podUID="e5bdb323-afb3-490d-a32e-0f2b04579c86" containerName="main" probeResult="failure" output="Get \"https://10.132.0.42:8001/health\": dial tcp 10.132.0.42:8001: connect: connection refused"
Apr 16 17:14:43.226118 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:14:43.226013 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7f8ccb594c-zf7sz" podUID="e5bdb323-afb3-490d-a32e-0f2b04579c86" containerName="main" probeResult="failure" output="Get \"https://10.132.0.42:8001/health\": dial tcp 10.132.0.42:8001: connect: connection refused"
Apr 16 17:14:53.226791 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:14:53.226741 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7f8ccb594c-zf7sz" podUID="e5bdb323-afb3-490d-a32e-0f2b04579c86" containerName="main" probeResult="failure" output="Get \"https://10.132.0.42:8001/health\": dial tcp 10.132.0.42:8001: connect: connection refused"
Apr 16 17:15:03.226347 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:03.226296 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7f8ccb594c-zf7sz" podUID="e5bdb323-afb3-490d-a32e-0f2b04579c86" containerName="main" probeResult="failure" output="Get \"https://10.132.0.42:8001/health\": dial tcp 10.132.0.42:8001: connect: connection refused"
Apr 16 17:15:13.236479 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:13.236439 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7f8ccb594c-zf7sz"
Apr 16 17:15:13.252359 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:13.252331 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7f8ccb594c-zf7sz"
Apr 16 17:15:24.405170 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:24.405118 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-8546c8d7447ksk"]
Apr 16 17:15:24.405756 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:24.405645 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-8546c8d7447ksk" podUID="42d0a094-07eb-454d-a708-b1897293a5a4" containerName="main" containerID="cri-o://ec4fcbdb33c29aa1eefa09f4ae909644c11bb4bad619d0b1731fdd1afe5232df" gracePeriod=30
Apr 16 17:15:24.405756 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:24.405686 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-8546c8d7447ksk" podUID="42d0a094-07eb-454d-a708-b1897293a5a4" containerName="tokenizer" containerID="cri-o://2591a72ffc3116d8c11ce36a22f64a8d8a1f9db0c5e0973985e140bc43b9ee07" gracePeriod=30
Apr 16 17:15:24.407082 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:24.407053 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7f8ccb594c-zf7sz"]
Apr 16 17:15:24.407491 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:24.407445 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7f8ccb594c-zf7sz" podUID="e5bdb323-afb3-490d-a32e-0f2b04579c86" containerName="main" containerID="cri-o://c3cc97fe575db8194d1ce1fe2a6255271c503b70efd1bb974febdedf094a687f" gracePeriod=30
Apr 16 17:15:24.692567 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:24.692483 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-8546c8d7447ksk" podUID="42d0a094-07eb-454d-a708-b1897293a5a4" containerName="tokenizer" probeResult="failure" output="Get \"http://10.132.0.43:8082/healthz\": dial tcp 10.132.0.43:8082: connect: connection refused"
Apr 16 17:15:25.037647 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:25.037613 2573 generic.go:358] "Generic (PLEG): container finished" podID="42d0a094-07eb-454d-a708-b1897293a5a4" containerID="ec4fcbdb33c29aa1eefa09f4ae909644c11bb4bad619d0b1731fdd1afe5232df" exitCode=0
Apr 16 17:15:25.037647 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:25.037640 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-8546c8d7447ksk" event={"ID":"42d0a094-07eb-454d-a708-b1897293a5a4","Type":"ContainerDied","Data":"ec4fcbdb33c29aa1eefa09f4ae909644c11bb4bad619d0b1731fdd1afe5232df"}
Apr 16 17:15:25.560401 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:25.560359 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-8546c8d7447ksk"
Apr 16 17:15:25.638110 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:25.638048 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/42d0a094-07eb-454d-a708-b1897293a5a4-tokenizer-uds\") pod \"42d0a094-07eb-454d-a708-b1897293a5a4\" (UID: \"42d0a094-07eb-454d-a708-b1897293a5a4\") "
Apr 16 17:15:25.638241 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:25.638120 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/42d0a094-07eb-454d-a708-b1897293a5a4-tokenizer-tmp\") pod \"42d0a094-07eb-454d-a708-b1897293a5a4\" (UID: \"42d0a094-07eb-454d-a708-b1897293a5a4\") "
Apr 16 17:15:25.638241 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:25.638139 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/42d0a094-07eb-454d-a708-b1897293a5a4-tls-certs\") pod \"42d0a094-07eb-454d-a708-b1897293a5a4\" (UID: \"42d0a094-07eb-454d-a708-b1897293a5a4\") "
Apr 16 17:15:25.638241 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:25.638159 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vwn87\" (UniqueName: \"kubernetes.io/projected/42d0a094-07eb-454d-a708-b1897293a5a4-kube-api-access-vwn87\") pod \"42d0a094-07eb-454d-a708-b1897293a5a4\" (UID: \"42d0a094-07eb-454d-a708-b1897293a5a4\") "
Apr 16 17:15:25.638241 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:25.638181 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/42d0a094-07eb-454d-a708-b1897293a5a4-tokenizer-cache\") pod 
\"42d0a094-07eb-454d-a708-b1897293a5a4\" (UID: \"42d0a094-07eb-454d-a708-b1897293a5a4\") " Apr 16 17:15:25.638480 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:25.638306 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42d0a094-07eb-454d-a708-b1897293a5a4-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "42d0a094-07eb-454d-a708-b1897293a5a4" (UID: "42d0a094-07eb-454d-a708-b1897293a5a4"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:15:25.638480 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:25.638321 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/42d0a094-07eb-454d-a708-b1897293a5a4-kserve-provision-location\") pod \"42d0a094-07eb-454d-a708-b1897293a5a4\" (UID: \"42d0a094-07eb-454d-a708-b1897293a5a4\") " Apr 16 17:15:25.638480 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:25.638423 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42d0a094-07eb-454d-a708-b1897293a5a4-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "42d0a094-07eb-454d-a708-b1897293a5a4" (UID: "42d0a094-07eb-454d-a708-b1897293a5a4"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:15:25.638644 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:25.638521 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42d0a094-07eb-454d-a708-b1897293a5a4-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "42d0a094-07eb-454d-a708-b1897293a5a4" (UID: "42d0a094-07eb-454d-a708-b1897293a5a4"). InnerVolumeSpecName "tokenizer-tmp". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:15:25.638687 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:25.638669 2573 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/42d0a094-07eb-454d-a708-b1897293a5a4-tokenizer-cache\") on node \"ip-10-0-138-58.ec2.internal\" DevicePath \"\"" Apr 16 17:15:25.638724 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:25.638688 2573 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/42d0a094-07eb-454d-a708-b1897293a5a4-tokenizer-uds\") on node \"ip-10-0-138-58.ec2.internal\" DevicePath \"\"" Apr 16 17:15:25.638724 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:25.638704 2573 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/42d0a094-07eb-454d-a708-b1897293a5a4-tokenizer-tmp\") on node \"ip-10-0-138-58.ec2.internal\" DevicePath \"\"" Apr 16 17:15:25.639008 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:25.638989 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42d0a094-07eb-454d-a708-b1897293a5a4-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "42d0a094-07eb-454d-a708-b1897293a5a4" (UID: "42d0a094-07eb-454d-a708-b1897293a5a4"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:15:25.640411 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:25.640362 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42d0a094-07eb-454d-a708-b1897293a5a4-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "42d0a094-07eb-454d-a708-b1897293a5a4" (UID: "42d0a094-07eb-454d-a708-b1897293a5a4"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 17:15:25.640512 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:25.640411 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42d0a094-07eb-454d-a708-b1897293a5a4-kube-api-access-vwn87" (OuterVolumeSpecName: "kube-api-access-vwn87") pod "42d0a094-07eb-454d-a708-b1897293a5a4" (UID: "42d0a094-07eb-454d-a708-b1897293a5a4"). InnerVolumeSpecName "kube-api-access-vwn87". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 17:15:25.739427 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:25.739407 2573 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/42d0a094-07eb-454d-a708-b1897293a5a4-tls-certs\") on node \"ip-10-0-138-58.ec2.internal\" DevicePath \"\"" Apr 16 17:15:25.739427 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:25.739426 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vwn87\" (UniqueName: \"kubernetes.io/projected/42d0a094-07eb-454d-a708-b1897293a5a4-kube-api-access-vwn87\") on node \"ip-10-0-138-58.ec2.internal\" DevicePath \"\"" Apr 16 17:15:25.739551 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:25.739437 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/42d0a094-07eb-454d-a708-b1897293a5a4-kserve-provision-location\") on node \"ip-10-0-138-58.ec2.internal\" DevicePath \"\"" Apr 16 17:15:26.043761 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:26.043723 2573 generic.go:358] "Generic (PLEG): container finished" podID="42d0a094-07eb-454d-a708-b1897293a5a4" containerID="2591a72ffc3116d8c11ce36a22f64a8d8a1f9db0c5e0973985e140bc43b9ee07" exitCode=0 Apr 16 17:15:26.043881 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:26.043823 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-8546c8d7447ksk" Apr 16 17:15:26.043929 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:26.043815 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-8546c8d7447ksk" event={"ID":"42d0a094-07eb-454d-a708-b1897293a5a4","Type":"ContainerDied","Data":"2591a72ffc3116d8c11ce36a22f64a8d8a1f9db0c5e0973985e140bc43b9ee07"} Apr 16 17:15:26.043967 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:26.043939 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-8546c8d7447ksk" event={"ID":"42d0a094-07eb-454d-a708-b1897293a5a4","Type":"ContainerDied","Data":"7d5f42cb7f3f193a98f8820c8971473880f1fbbbfc4120d574ea2586833471f3"} Apr 16 17:15:26.043967 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:26.043960 2573 scope.go:117] "RemoveContainer" containerID="2591a72ffc3116d8c11ce36a22f64a8d8a1f9db0c5e0973985e140bc43b9ee07" Apr 16 17:15:26.053838 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:26.053821 2573 scope.go:117] "RemoveContainer" containerID="ec4fcbdb33c29aa1eefa09f4ae909644c11bb4bad619d0b1731fdd1afe5232df" Apr 16 17:15:26.061912 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:26.061892 2573 scope.go:117] "RemoveContainer" containerID="c8688e676e93a0c9fc9aa90c18c1ab114609b8425afe69ea8861dbcfdc176364" Apr 16 17:15:26.068218 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:26.068190 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-8546c8d7447ksk"] Apr 16 17:15:26.070697 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:26.070670 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-8546c8d7447ksk"] Apr 16 17:15:26.071605 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:26.071589 2573 
scope.go:117] "RemoveContainer" containerID="2591a72ffc3116d8c11ce36a22f64a8d8a1f9db0c5e0973985e140bc43b9ee07" Apr 16 17:15:26.071901 ip-10-0-138-58 kubenswrapper[2573]: E0416 17:15:26.071876 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2591a72ffc3116d8c11ce36a22f64a8d8a1f9db0c5e0973985e140bc43b9ee07\": container with ID starting with 2591a72ffc3116d8c11ce36a22f64a8d8a1f9db0c5e0973985e140bc43b9ee07 not found: ID does not exist" containerID="2591a72ffc3116d8c11ce36a22f64a8d8a1f9db0c5e0973985e140bc43b9ee07" Apr 16 17:15:26.071986 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:26.071914 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2591a72ffc3116d8c11ce36a22f64a8d8a1f9db0c5e0973985e140bc43b9ee07"} err="failed to get container status \"2591a72ffc3116d8c11ce36a22f64a8d8a1f9db0c5e0973985e140bc43b9ee07\": rpc error: code = NotFound desc = could not find container \"2591a72ffc3116d8c11ce36a22f64a8d8a1f9db0c5e0973985e140bc43b9ee07\": container with ID starting with 2591a72ffc3116d8c11ce36a22f64a8d8a1f9db0c5e0973985e140bc43b9ee07 not found: ID does not exist" Apr 16 17:15:26.071986 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:26.071942 2573 scope.go:117] "RemoveContainer" containerID="ec4fcbdb33c29aa1eefa09f4ae909644c11bb4bad619d0b1731fdd1afe5232df" Apr 16 17:15:26.072224 ip-10-0-138-58 kubenswrapper[2573]: E0416 17:15:26.072205 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec4fcbdb33c29aa1eefa09f4ae909644c11bb4bad619d0b1731fdd1afe5232df\": container with ID starting with ec4fcbdb33c29aa1eefa09f4ae909644c11bb4bad619d0b1731fdd1afe5232df not found: ID does not exist" containerID="ec4fcbdb33c29aa1eefa09f4ae909644c11bb4bad619d0b1731fdd1afe5232df" Apr 16 17:15:26.072266 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:26.072230 2573 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec4fcbdb33c29aa1eefa09f4ae909644c11bb4bad619d0b1731fdd1afe5232df"} err="failed to get container status \"ec4fcbdb33c29aa1eefa09f4ae909644c11bb4bad619d0b1731fdd1afe5232df\": rpc error: code = NotFound desc = could not find container \"ec4fcbdb33c29aa1eefa09f4ae909644c11bb4bad619d0b1731fdd1afe5232df\": container with ID starting with ec4fcbdb33c29aa1eefa09f4ae909644c11bb4bad619d0b1731fdd1afe5232df not found: ID does not exist" Apr 16 17:15:26.072266 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:26.072246 2573 scope.go:117] "RemoveContainer" containerID="c8688e676e93a0c9fc9aa90c18c1ab114609b8425afe69ea8861dbcfdc176364" Apr 16 17:15:26.072574 ip-10-0-138-58 kubenswrapper[2573]: E0416 17:15:26.072553 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c8688e676e93a0c9fc9aa90c18c1ab114609b8425afe69ea8861dbcfdc176364\": container with ID starting with c8688e676e93a0c9fc9aa90c18c1ab114609b8425afe69ea8861dbcfdc176364 not found: ID does not exist" containerID="c8688e676e93a0c9fc9aa90c18c1ab114609b8425afe69ea8861dbcfdc176364" Apr 16 17:15:26.072646 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:26.072586 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8688e676e93a0c9fc9aa90c18c1ab114609b8425afe69ea8861dbcfdc176364"} err="failed to get container status \"c8688e676e93a0c9fc9aa90c18c1ab114609b8425afe69ea8861dbcfdc176364\": rpc error: code = NotFound desc = could not find container \"c8688e676e93a0c9fc9aa90c18c1ab114609b8425afe69ea8861dbcfdc176364\": container with ID starting with c8688e676e93a0c9fc9aa90c18c1ab114609b8425afe69ea8861dbcfdc176364 not found: ID does not exist" Apr 16 17:15:27.421461 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:27.421426 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42d0a094-07eb-454d-a708-b1897293a5a4" 
path="/var/lib/kubelet/pods/42d0a094-07eb-454d-a708-b1897293a5a4/volumes" Apr 16 17:15:39.554332 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:39.554297 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-l24v6_60778c40-92ca-4f0b-9544-83d16c23c3a9/istio-proxy/0.log" Apr 16 17:15:39.625024 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:39.624997 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7f8ccb594c-zf7sz_e5bdb323-afb3-490d-a32e-0f2b04579c86/main/0.log" Apr 16 17:15:39.631016 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:39.630995 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7f8ccb594c-zf7sz_e5bdb323-afb3-490d-a32e-0f2b04579c86/llm-d-routing-sidecar/0.log" Apr 16 17:15:39.640153 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:39.640135 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7f8ccb594c-zf7sz_e5bdb323-afb3-490d-a32e-0f2b04579c86/storage-initializer/0.log" Apr 16 17:15:40.599936 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:40.599904 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-l24v6_60778c40-92ca-4f0b-9544-83d16c23c3a9/istio-proxy/0.log" Apr 16 17:15:40.642589 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:40.642559 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7f8ccb594c-zf7sz_e5bdb323-afb3-490d-a32e-0f2b04579c86/main/0.log" Apr 16 17:15:40.650300 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:40.650278 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7f8ccb594c-zf7sz_e5bdb323-afb3-490d-a32e-0f2b04579c86/llm-d-routing-sidecar/0.log" Apr 16 17:15:40.659333 
ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:40.659311 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7f8ccb594c-zf7sz_e5bdb323-afb3-490d-a32e-0f2b04579c86/storage-initializer/0.log" Apr 16 17:15:41.615947 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:41.615013 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-l24v6_60778c40-92ca-4f0b-9544-83d16c23c3a9/istio-proxy/0.log" Apr 16 17:15:41.656486 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:41.656426 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7f8ccb594c-zf7sz_e5bdb323-afb3-490d-a32e-0f2b04579c86/main/0.log" Apr 16 17:15:41.662942 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:41.662916 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7f8ccb594c-zf7sz_e5bdb323-afb3-490d-a32e-0f2b04579c86/llm-d-routing-sidecar/0.log" Apr 16 17:15:41.671839 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:41.671820 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7f8ccb594c-zf7sz_e5bdb323-afb3-490d-a32e-0f2b04579c86/storage-initializer/0.log" Apr 16 17:15:42.598912 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:42.598882 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-l24v6_60778c40-92ca-4f0b-9544-83d16c23c3a9/istio-proxy/0.log" Apr 16 17:15:42.638268 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:42.638243 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7f8ccb594c-zf7sz_e5bdb323-afb3-490d-a32e-0f2b04579c86/main/0.log" Apr 16 17:15:42.645039 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:42.645018 2573 log.go:25] "Finished parsing log 
file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7f8ccb594c-zf7sz_e5bdb323-afb3-490d-a32e-0f2b04579c86/llm-d-routing-sidecar/0.log" Apr 16 17:15:42.654572 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:42.654546 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7f8ccb594c-zf7sz_e5bdb323-afb3-490d-a32e-0f2b04579c86/storage-initializer/0.log" Apr 16 17:15:43.574518 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:43.574481 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-l24v6_60778c40-92ca-4f0b-9544-83d16c23c3a9/istio-proxy/0.log" Apr 16 17:15:43.611271 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:43.611248 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7f8ccb594c-zf7sz_e5bdb323-afb3-490d-a32e-0f2b04579c86/main/0.log" Apr 16 17:15:43.617073 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:43.617053 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7f8ccb594c-zf7sz_e5bdb323-afb3-490d-a32e-0f2b04579c86/llm-d-routing-sidecar/0.log" Apr 16 17:15:43.626146 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:43.626117 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7f8ccb594c-zf7sz_e5bdb323-afb3-490d-a32e-0f2b04579c86/storage-initializer/0.log" Apr 16 17:15:44.536240 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:44.536209 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-l24v6_60778c40-92ca-4f0b-9544-83d16c23c3a9/istio-proxy/0.log" Apr 16 17:15:44.575764 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:44.575732 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7f8ccb594c-zf7sz_e5bdb323-afb3-490d-a32e-0f2b04579c86/main/0.log" Apr 16 17:15:44.581994 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:44.581966 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7f8ccb594c-zf7sz_e5bdb323-afb3-490d-a32e-0f2b04579c86/llm-d-routing-sidecar/0.log" Apr 16 17:15:44.590654 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:44.590636 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7f8ccb594c-zf7sz_e5bdb323-afb3-490d-a32e-0f2b04579c86/storage-initializer/0.log" Apr 16 17:15:45.529449 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:45.529406 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-l24v6_60778c40-92ca-4f0b-9544-83d16c23c3a9/istio-proxy/0.log" Apr 16 17:15:45.569363 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:45.569326 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7f8ccb594c-zf7sz_e5bdb323-afb3-490d-a32e-0f2b04579c86/main/0.log" Apr 16 17:15:45.577550 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:45.577530 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7f8ccb594c-zf7sz_e5bdb323-afb3-490d-a32e-0f2b04579c86/llm-d-routing-sidecar/0.log" Apr 16 17:15:45.586968 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:45.586941 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7f8ccb594c-zf7sz_e5bdb323-afb3-490d-a32e-0f2b04579c86/storage-initializer/0.log" Apr 16 17:15:46.508704 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:46.508675 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-l24v6_60778c40-92ca-4f0b-9544-83d16c23c3a9/istio-proxy/0.log" Apr 16 17:15:46.547199 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:46.547177 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7f8ccb594c-zf7sz_e5bdb323-afb3-490d-a32e-0f2b04579c86/main/0.log" Apr 16 17:15:46.556704 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:46.556686 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7f8ccb594c-zf7sz_e5bdb323-afb3-490d-a32e-0f2b04579c86/llm-d-routing-sidecar/0.log" Apr 16 17:15:46.566204 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:46.566185 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7f8ccb594c-zf7sz_e5bdb323-afb3-490d-a32e-0f2b04579c86/storage-initializer/0.log" Apr 16 17:15:47.539101 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:47.539065 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-l24v6_60778c40-92ca-4f0b-9544-83d16c23c3a9/istio-proxy/0.log" Apr 16 17:15:47.577826 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:47.577803 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7f8ccb594c-zf7sz_e5bdb323-afb3-490d-a32e-0f2b04579c86/main/0.log" Apr 16 17:15:47.586552 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:47.586532 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7f8ccb594c-zf7sz_e5bdb323-afb3-490d-a32e-0f2b04579c86/llm-d-routing-sidecar/0.log" Apr 16 17:15:47.595864 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:47.595840 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7f8ccb594c-zf7sz_e5bdb323-afb3-490d-a32e-0f2b04579c86/storage-initializer/0.log" Apr 16 17:15:48.558360 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:48.558334 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-l24v6_60778c40-92ca-4f0b-9544-83d16c23c3a9/istio-proxy/0.log" Apr 16 17:15:48.597470 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:48.597438 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7f8ccb594c-zf7sz_e5bdb323-afb3-490d-a32e-0f2b04579c86/main/0.log" Apr 16 17:15:48.617353 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:48.617328 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7f8ccb594c-zf7sz_e5bdb323-afb3-490d-a32e-0f2b04579c86/llm-d-routing-sidecar/0.log" Apr 16 17:15:48.627355 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:48.627331 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7f8ccb594c-zf7sz_e5bdb323-afb3-490d-a32e-0f2b04579c86/storage-initializer/0.log" Apr 16 17:15:49.549551 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:49.549519 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-l24v6_60778c40-92ca-4f0b-9544-83d16c23c3a9/istio-proxy/0.log" Apr 16 17:15:49.588671 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:49.588632 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7f8ccb594c-zf7sz_e5bdb323-afb3-490d-a32e-0f2b04579c86/main/0.log" Apr 16 17:15:49.595073 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:49.595052 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7f8ccb594c-zf7sz_e5bdb323-afb3-490d-a32e-0f2b04579c86/llm-d-routing-sidecar/0.log" Apr 16 17:15:49.604643 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:49.604620 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7f8ccb594c-zf7sz_e5bdb323-afb3-490d-a32e-0f2b04579c86/storage-initializer/0.log" Apr 16 17:15:50.535714 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:50.535681 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-l24v6_60778c40-92ca-4f0b-9544-83d16c23c3a9/istio-proxy/0.log" Apr 16 17:15:50.573868 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:50.573839 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7f8ccb594c-zf7sz_e5bdb323-afb3-490d-a32e-0f2b04579c86/main/0.log" Apr 16 17:15:50.580759 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:50.580737 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7f8ccb594c-zf7sz_e5bdb323-afb3-490d-a32e-0f2b04579c86/llm-d-routing-sidecar/0.log" Apr 16 17:15:50.590456 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:50.590430 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7f8ccb594c-zf7sz_e5bdb323-afb3-490d-a32e-0f2b04579c86/storage-initializer/0.log" Apr 16 17:15:51.520089 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:51.520062 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-l24v6_60778c40-92ca-4f0b-9544-83d16c23c3a9/istio-proxy/0.log" Apr 16 17:15:51.557722 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:51.557692 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7f8ccb594c-zf7sz_e5bdb323-afb3-490d-a32e-0f2b04579c86/main/0.log" Apr 16 17:15:51.563997 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:51.563969 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7f8ccb594c-zf7sz_e5bdb323-afb3-490d-a32e-0f2b04579c86/llm-d-routing-sidecar/0.log" Apr 16 17:15:51.573110 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:51.573092 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7f8ccb594c-zf7sz_e5bdb323-afb3-490d-a32e-0f2b04579c86/storage-initializer/0.log" Apr 16 17:15:52.498465 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:52.498435 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-l24v6_60778c40-92ca-4f0b-9544-83d16c23c3a9/istio-proxy/0.log" Apr 16 17:15:52.537614 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:52.537584 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7f8ccb594c-zf7sz_e5bdb323-afb3-490d-a32e-0f2b04579c86/main/0.log" Apr 16 17:15:52.544516 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:52.544493 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7f8ccb594c-zf7sz_e5bdb323-afb3-490d-a32e-0f2b04579c86/llm-d-routing-sidecar/0.log" Apr 16 17:15:52.555301 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:52.555259 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7f8ccb594c-zf7sz_e5bdb323-afb3-490d-a32e-0f2b04579c86/storage-initializer/0.log" Apr 16 17:15:53.533899 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:53.533844 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ingress_router-default-5887855b54-djpw5_682aafdb-c596-4a7b-8112-c6c867ff770e/router/0.log" Apr 16 17:15:54.350783 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:54.350754 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-5887855b54-djpw5_682aafdb-c596-4a7b-8112-c6c867ff770e/router/0.log" Apr 16 17:15:54.407506 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:54.407452 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7f8ccb594c-zf7sz" podUID="e5bdb323-afb3-490d-a32e-0f2b04579c86" containerName="llm-d-routing-sidecar" containerID="cri-o://ae173c29f450cfa0dd2e96b79b5b75be34aa612f9f707494ef178726cd3c8e51" gracePeriod=2 Apr 16 17:15:54.674271 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:54.674247 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7f8ccb594c-zf7sz_e5bdb323-afb3-490d-a32e-0f2b04579c86/main/0.log" Apr 16 17:15:54.674930 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:54.674912 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7f8ccb594c-zf7sz" Apr 16 17:15:54.750356 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:54.750326 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e5bdb323-afb3-490d-a32e-0f2b04579c86-kserve-provision-location\") pod \"e5bdb323-afb3-490d-a32e-0f2b04579c86\" (UID: \"e5bdb323-afb3-490d-a32e-0f2b04579c86\") " Apr 16 17:15:54.750529 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:54.750421 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e5bdb323-afb3-490d-a32e-0f2b04579c86-dshm\") pod \"e5bdb323-afb3-490d-a32e-0f2b04579c86\" (UID: \"e5bdb323-afb3-490d-a32e-0f2b04579c86\") " Apr 16 17:15:54.750529 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:54.750456 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e5bdb323-afb3-490d-a32e-0f2b04579c86-model-cache\") pod \"e5bdb323-afb3-490d-a32e-0f2b04579c86\" (UID: \"e5bdb323-afb3-490d-a32e-0f2b04579c86\") " Apr 16 17:15:54.750529 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:54.750479 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e5bdb323-afb3-490d-a32e-0f2b04579c86-home\") pod \"e5bdb323-afb3-490d-a32e-0f2b04579c86\" (UID: \"e5bdb323-afb3-490d-a32e-0f2b04579c86\") " Apr 16 17:15:54.750529 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:54.750506 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pfzcd\" (UniqueName: \"kubernetes.io/projected/e5bdb323-afb3-490d-a32e-0f2b04579c86-kube-api-access-pfzcd\") pod \"e5bdb323-afb3-490d-a32e-0f2b04579c86\" (UID: \"e5bdb323-afb3-490d-a32e-0f2b04579c86\") " Apr 16 17:15:54.750755 ip-10-0-138-58 
kubenswrapper[2573]: I0416 17:15:54.750532 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e5bdb323-afb3-490d-a32e-0f2b04579c86-tls-certs\") pod \"e5bdb323-afb3-490d-a32e-0f2b04579c86\" (UID: \"e5bdb323-afb3-490d-a32e-0f2b04579c86\") " Apr 16 17:15:54.750755 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:54.750694 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5bdb323-afb3-490d-a32e-0f2b04579c86-model-cache" (OuterVolumeSpecName: "model-cache") pod "e5bdb323-afb3-490d-a32e-0f2b04579c86" (UID: "e5bdb323-afb3-490d-a32e-0f2b04579c86"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:15:54.750920 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:54.750756 2573 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e5bdb323-afb3-490d-a32e-0f2b04579c86-model-cache\") on node \"ip-10-0-138-58.ec2.internal\" DevicePath \"\"" Apr 16 17:15:54.750920 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:54.750881 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5bdb323-afb3-490d-a32e-0f2b04579c86-home" (OuterVolumeSpecName: "home") pod "e5bdb323-afb3-490d-a32e-0f2b04579c86" (UID: "e5bdb323-afb3-490d-a32e-0f2b04579c86"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:15:54.752794 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:54.752761 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5bdb323-afb3-490d-a32e-0f2b04579c86-dshm" (OuterVolumeSpecName: "dshm") pod "e5bdb323-afb3-490d-a32e-0f2b04579c86" (UID: "e5bdb323-afb3-490d-a32e-0f2b04579c86"). InnerVolumeSpecName "dshm". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:15:54.752794 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:54.752784 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5bdb323-afb3-490d-a32e-0f2b04579c86-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "e5bdb323-afb3-490d-a32e-0f2b04579c86" (UID: "e5bdb323-afb3-490d-a32e-0f2b04579c86"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 17:15:54.752947 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:54.752902 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5bdb323-afb3-490d-a32e-0f2b04579c86-kube-api-access-pfzcd" (OuterVolumeSpecName: "kube-api-access-pfzcd") pod "e5bdb323-afb3-490d-a32e-0f2b04579c86" (UID: "e5bdb323-afb3-490d-a32e-0f2b04579c86"). InnerVolumeSpecName "kube-api-access-pfzcd". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 17:15:54.807187 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:54.807150 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5bdb323-afb3-490d-a32e-0f2b04579c86-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "e5bdb323-afb3-490d-a32e-0f2b04579c86" (UID: "e5bdb323-afb3-490d-a32e-0f2b04579c86"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:15:54.851102 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:54.851074 2573 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e5bdb323-afb3-490d-a32e-0f2b04579c86-dshm\") on node \"ip-10-0-138-58.ec2.internal\" DevicePath \"\"" Apr 16 17:15:54.851102 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:54.851098 2573 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e5bdb323-afb3-490d-a32e-0f2b04579c86-home\") on node \"ip-10-0-138-58.ec2.internal\" DevicePath \"\"" Apr 16 17:15:54.851260 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:54.851110 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pfzcd\" (UniqueName: \"kubernetes.io/projected/e5bdb323-afb3-490d-a32e-0f2b04579c86-kube-api-access-pfzcd\") on node \"ip-10-0-138-58.ec2.internal\" DevicePath \"\"" Apr 16 17:15:54.851260 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:54.851123 2573 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e5bdb323-afb3-490d-a32e-0f2b04579c86-tls-certs\") on node \"ip-10-0-138-58.ec2.internal\" DevicePath \"\"" Apr 16 17:15:54.851260 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:54.851137 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e5bdb323-afb3-490d-a32e-0f2b04579c86-kserve-provision-location\") on node \"ip-10-0-138-58.ec2.internal\" DevicePath \"\"" Apr 16 17:15:55.148942 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:55.148902 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-68bd676465-5ttpp_b3993f29-38ac-41b3-aed8-7010c1a5b79a/authorino/0.log" Apr 16 17:15:55.150812 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:55.150791 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7f8ccb594c-zf7sz_e5bdb323-afb3-490d-a32e-0f2b04579c86/main/0.log" Apr 16 17:15:55.151356 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:55.151338 2573 generic.go:358] "Generic (PLEG): container finished" podID="e5bdb323-afb3-490d-a32e-0f2b04579c86" containerID="c3cc97fe575db8194d1ce1fe2a6255271c503b70efd1bb974febdedf094a687f" exitCode=137 Apr 16 17:15:55.151356 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:55.151355 2573 generic.go:358] "Generic (PLEG): container finished" podID="e5bdb323-afb3-490d-a32e-0f2b04579c86" containerID="ae173c29f450cfa0dd2e96b79b5b75be34aa612f9f707494ef178726cd3c8e51" exitCode=0 Apr 16 17:15:55.151492 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:55.151421 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7f8ccb594c-zf7sz" event={"ID":"e5bdb323-afb3-490d-a32e-0f2b04579c86","Type":"ContainerDied","Data":"c3cc97fe575db8194d1ce1fe2a6255271c503b70efd1bb974febdedf094a687f"} Apr 16 17:15:55.151492 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:55.151428 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7f8ccb594c-zf7sz" Apr 16 17:15:55.151492 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:55.151456 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7f8ccb594c-zf7sz" event={"ID":"e5bdb323-afb3-490d-a32e-0f2b04579c86","Type":"ContainerDied","Data":"ae173c29f450cfa0dd2e96b79b5b75be34aa612f9f707494ef178726cd3c8e51"} Apr 16 17:15:55.151492 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:55.151466 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7f8ccb594c-zf7sz" event={"ID":"e5bdb323-afb3-490d-a32e-0f2b04579c86","Type":"ContainerDied","Data":"fcf93a3ef39446f8053c5b0ff4da7b20595698cb89fc5363aa10284fcab6f1e9"} Apr 16 17:15:55.151492 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:55.151481 2573 scope.go:117] "RemoveContainer" containerID="c3cc97fe575db8194d1ce1fe2a6255271c503b70efd1bb974febdedf094a687f" Apr 16 17:15:55.170944 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:55.170926 2573 scope.go:117] "RemoveContainer" containerID="4f022acdbd97b587cf002faeb248262fbcf4937703004c88eacc53c00bd2bce5" Apr 16 17:15:55.174346 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:55.174323 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7f8ccb594c-zf7sz"] Apr 16 17:15:55.178823 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:55.178803 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7f8ccb594c-zf7sz"] Apr 16 17:15:55.190371 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:55.190355 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6c886788f8-d76cw_169c36ca-e161-40b0-9d76-166f4626fa3e/kuadrant-console-plugin/0.log" Apr 16 17:15:55.237981 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:55.237961 2573 
scope.go:117] "RemoveContainer" containerID="ae173c29f450cfa0dd2e96b79b5b75be34aa612f9f707494ef178726cd3c8e51" Apr 16 17:15:55.245723 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:55.245699 2573 scope.go:117] "RemoveContainer" containerID="c3cc97fe575db8194d1ce1fe2a6255271c503b70efd1bb974febdedf094a687f" Apr 16 17:15:55.246026 ip-10-0-138-58 kubenswrapper[2573]: E0416 17:15:55.246007 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3cc97fe575db8194d1ce1fe2a6255271c503b70efd1bb974febdedf094a687f\": container with ID starting with c3cc97fe575db8194d1ce1fe2a6255271c503b70efd1bb974febdedf094a687f not found: ID does not exist" containerID="c3cc97fe575db8194d1ce1fe2a6255271c503b70efd1bb974febdedf094a687f" Apr 16 17:15:55.246102 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:55.246038 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3cc97fe575db8194d1ce1fe2a6255271c503b70efd1bb974febdedf094a687f"} err="failed to get container status \"c3cc97fe575db8194d1ce1fe2a6255271c503b70efd1bb974febdedf094a687f\": rpc error: code = NotFound desc = could not find container \"c3cc97fe575db8194d1ce1fe2a6255271c503b70efd1bb974febdedf094a687f\": container with ID starting with c3cc97fe575db8194d1ce1fe2a6255271c503b70efd1bb974febdedf094a687f not found: ID does not exist" Apr 16 17:15:55.246102 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:55.246057 2573 scope.go:117] "RemoveContainer" containerID="4f022acdbd97b587cf002faeb248262fbcf4937703004c88eacc53c00bd2bce5" Apr 16 17:15:55.246295 ip-10-0-138-58 kubenswrapper[2573]: E0416 17:15:55.246277 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f022acdbd97b587cf002faeb248262fbcf4937703004c88eacc53c00bd2bce5\": container with ID starting with 4f022acdbd97b587cf002faeb248262fbcf4937703004c88eacc53c00bd2bce5 not found: ID does not 
exist" containerID="4f022acdbd97b587cf002faeb248262fbcf4937703004c88eacc53c00bd2bce5" Apr 16 17:15:55.246365 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:55.246304 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f022acdbd97b587cf002faeb248262fbcf4937703004c88eacc53c00bd2bce5"} err="failed to get container status \"4f022acdbd97b587cf002faeb248262fbcf4937703004c88eacc53c00bd2bce5\": rpc error: code = NotFound desc = could not find container \"4f022acdbd97b587cf002faeb248262fbcf4937703004c88eacc53c00bd2bce5\": container with ID starting with 4f022acdbd97b587cf002faeb248262fbcf4937703004c88eacc53c00bd2bce5 not found: ID does not exist" Apr 16 17:15:55.246365 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:55.246325 2573 scope.go:117] "RemoveContainer" containerID="ae173c29f450cfa0dd2e96b79b5b75be34aa612f9f707494ef178726cd3c8e51" Apr 16 17:15:55.246627 ip-10-0-138-58 kubenswrapper[2573]: E0416 17:15:55.246607 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae173c29f450cfa0dd2e96b79b5b75be34aa612f9f707494ef178726cd3c8e51\": container with ID starting with ae173c29f450cfa0dd2e96b79b5b75be34aa612f9f707494ef178726cd3c8e51 not found: ID does not exist" containerID="ae173c29f450cfa0dd2e96b79b5b75be34aa612f9f707494ef178726cd3c8e51" Apr 16 17:15:55.246705 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:55.246632 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae173c29f450cfa0dd2e96b79b5b75be34aa612f9f707494ef178726cd3c8e51"} err="failed to get container status \"ae173c29f450cfa0dd2e96b79b5b75be34aa612f9f707494ef178726cd3c8e51\": rpc error: code = NotFound desc = could not find container \"ae173c29f450cfa0dd2e96b79b5b75be34aa612f9f707494ef178726cd3c8e51\": container with ID starting with ae173c29f450cfa0dd2e96b79b5b75be34aa612f9f707494ef178726cd3c8e51 not found: ID does not exist" Apr 16 
17:15:55.246705 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:55.246653 2573 scope.go:117] "RemoveContainer" containerID="c3cc97fe575db8194d1ce1fe2a6255271c503b70efd1bb974febdedf094a687f" Apr 16 17:15:55.246918 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:55.246897 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3cc97fe575db8194d1ce1fe2a6255271c503b70efd1bb974febdedf094a687f"} err="failed to get container status \"c3cc97fe575db8194d1ce1fe2a6255271c503b70efd1bb974febdedf094a687f\": rpc error: code = NotFound desc = could not find container \"c3cc97fe575db8194d1ce1fe2a6255271c503b70efd1bb974febdedf094a687f\": container with ID starting with c3cc97fe575db8194d1ce1fe2a6255271c503b70efd1bb974febdedf094a687f not found: ID does not exist" Apr 16 17:15:55.246999 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:55.246919 2573 scope.go:117] "RemoveContainer" containerID="4f022acdbd97b587cf002faeb248262fbcf4937703004c88eacc53c00bd2bce5" Apr 16 17:15:55.247159 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:55.247140 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f022acdbd97b587cf002faeb248262fbcf4937703004c88eacc53c00bd2bce5"} err="failed to get container status \"4f022acdbd97b587cf002faeb248262fbcf4937703004c88eacc53c00bd2bce5\": rpc error: code = NotFound desc = could not find container \"4f022acdbd97b587cf002faeb248262fbcf4937703004c88eacc53c00bd2bce5\": container with ID starting with 4f022acdbd97b587cf002faeb248262fbcf4937703004c88eacc53c00bd2bce5 not found: ID does not exist" Apr 16 17:15:55.247212 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:55.247168 2573 scope.go:117] "RemoveContainer" containerID="ae173c29f450cfa0dd2e96b79b5b75be34aa612f9f707494ef178726cd3c8e51" Apr 16 17:15:55.247412 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:55.247394 2573 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ae173c29f450cfa0dd2e96b79b5b75be34aa612f9f707494ef178726cd3c8e51"} err="failed to get container status \"ae173c29f450cfa0dd2e96b79b5b75be34aa612f9f707494ef178726cd3c8e51\": rpc error: code = NotFound desc = could not find container \"ae173c29f450cfa0dd2e96b79b5b75be34aa612f9f707494ef178726cd3c8e51\": container with ID starting with ae173c29f450cfa0dd2e96b79b5b75be34aa612f9f707494ef178726cd3c8e51 not found: ID does not exist" Apr 16 17:15:55.421034 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:15:55.420969 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5bdb323-afb3-490d-a32e-0f2b04579c86" path="/var/lib/kubelet/pods/e5bdb323-afb3-490d-a32e-0f2b04579c86/volumes" Apr 16 17:16:00.376610 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:16:00.376577 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-4zxj2_59cc8b51-5a0b-45ea-8e53-f1473e78b939/global-pull-secret-syncer/0.log" Apr 16 17:16:00.513876 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:16:00.513852 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-ms2xr_1950f42e-3894-4436-a1c6-d5e65379ba61/konnectivity-agent/0.log" Apr 16 17:16:00.561057 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:16:00.561039 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-138-58.ec2.internal_43c81a0a9a01b0c05b287312dc013cbc/haproxy/0.log" Apr 16 17:16:04.582261 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:16:04.582227 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-68bd676465-5ttpp_b3993f29-38ac-41b3-aed8-7010c1a5b79a/authorino/0.log" Apr 16 17:16:04.663833 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:16:04.663809 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6c886788f8-d76cw_169c36ca-e161-40b0-9d76-166f4626fa3e/kuadrant-console-plugin/0.log" Apr 16 17:16:05.830650 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:16:05.830620 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-6667474d89-xn576_7bc7dff0-16af-4031-a829-4427a2699284/cluster-monitoring-operator/0.log" Apr 16 17:16:06.103176 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:16:06.103093 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-wbt4v_1fc72970-3e8e-4014-9362-eadf182a5df0/node-exporter/0.log" Apr 16 17:16:06.121312 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:16:06.121287 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-wbt4v_1fc72970-3e8e-4014-9362-eadf182a5df0/kube-rbac-proxy/0.log" Apr 16 17:16:06.139567 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:16:06.139546 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-wbt4v_1fc72970-3e8e-4014-9362-eadf182a5df0/init-textfile/0.log" Apr 16 17:16:07.786979 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:16:07.786954 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-5cb6cf4cb4-9v9hx_363c07d0-bf5c-4368-a3fe-6d5136c2cd22/networking-console-plugin/0.log" Apr 16 17:16:08.313867 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:16:08.313840 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-jgsfb_6c99cce0-b27a-481f-8825-9d205581b7d0/console-operator/1.log" Apr 16 17:16:08.322674 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:16:08.322643 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-jgsfb_6c99cce0-b27a-481f-8825-9d205581b7d0/console-operator/2.log" Apr 16 
17:16:09.225349 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:16:09.225323 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-msdj2/perf-node-gather-daemonset-sgb8l"] Apr 16 17:16:09.225741 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:16:09.225631 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9974a58f-765a-41bc-bebc-aafb5c7e387a" containerName="storage-initializer" Apr 16 17:16:09.225741 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:16:09.225644 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="9974a58f-765a-41bc-bebc-aafb5c7e387a" containerName="storage-initializer" Apr 16 17:16:09.225741 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:16:09.225651 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9974a58f-765a-41bc-bebc-aafb5c7e387a" containerName="main" Apr 16 17:16:09.225741 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:16:09.225657 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="9974a58f-765a-41bc-bebc-aafb5c7e387a" containerName="main" Apr 16 17:16:09.225741 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:16:09.225668 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5535f7d5-c91c-45d4-bd9d-4a6e3fe3a8f2" containerName="tokenizer" Apr 16 17:16:09.225741 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:16:09.225674 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="5535f7d5-c91c-45d4-bd9d-4a6e3fe3a8f2" containerName="tokenizer" Apr 16 17:16:09.225741 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:16:09.225681 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="42d0a094-07eb-454d-a708-b1897293a5a4" containerName="storage-initializer" Apr 16 17:16:09.225741 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:16:09.225687 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="42d0a094-07eb-454d-a708-b1897293a5a4" containerName="storage-initializer" Apr 16 17:16:09.225741 
ip-10-0-138-58 kubenswrapper[2573]: I0416 17:16:09.225694 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e5bdb323-afb3-490d-a32e-0f2b04579c86" containerName="storage-initializer" Apr 16 17:16:09.225741 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:16:09.225699 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5bdb323-afb3-490d-a32e-0f2b04579c86" containerName="storage-initializer" Apr 16 17:16:09.225741 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:16:09.225705 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e5bdb323-afb3-490d-a32e-0f2b04579c86" containerName="main" Apr 16 17:16:09.225741 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:16:09.225710 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5bdb323-afb3-490d-a32e-0f2b04579c86" containerName="main" Apr 16 17:16:09.225741 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:16:09.225717 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="42d0a094-07eb-454d-a708-b1897293a5a4" containerName="tokenizer" Apr 16 17:16:09.225741 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:16:09.225723 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="42d0a094-07eb-454d-a708-b1897293a5a4" containerName="tokenizer" Apr 16 17:16:09.225741 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:16:09.225738 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e5bdb323-afb3-490d-a32e-0f2b04579c86" containerName="llm-d-routing-sidecar" Apr 16 17:16:09.225741 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:16:09.225744 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5bdb323-afb3-490d-a32e-0f2b04579c86" containerName="llm-d-routing-sidecar" Apr 16 17:16:09.225741 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:16:09.225752 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5535f7d5-c91c-45d4-bd9d-4a6e3fe3a8f2" containerName="storage-initializer" Apr 16 
17:16:09.226269 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:16:09.225758 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="5535f7d5-c91c-45d4-bd9d-4a6e3fe3a8f2" containerName="storage-initializer" Apr 16 17:16:09.226269 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:16:09.225765 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5535f7d5-c91c-45d4-bd9d-4a6e3fe3a8f2" containerName="main" Apr 16 17:16:09.226269 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:16:09.225770 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="5535f7d5-c91c-45d4-bd9d-4a6e3fe3a8f2" containerName="main" Apr 16 17:16:09.226269 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:16:09.225775 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="42d0a094-07eb-454d-a708-b1897293a5a4" containerName="main" Apr 16 17:16:09.226269 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:16:09.225780 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="42d0a094-07eb-454d-a708-b1897293a5a4" containerName="main" Apr 16 17:16:09.226269 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:16:09.225842 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="42d0a094-07eb-454d-a708-b1897293a5a4" containerName="tokenizer" Apr 16 17:16:09.226269 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:16:09.225851 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="42d0a094-07eb-454d-a708-b1897293a5a4" containerName="main" Apr 16 17:16:09.226269 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:16:09.225858 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="5535f7d5-c91c-45d4-bd9d-4a6e3fe3a8f2" containerName="tokenizer" Apr 16 17:16:09.226269 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:16:09.225864 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="e5bdb323-afb3-490d-a32e-0f2b04579c86" containerName="llm-d-routing-sidecar" Apr 16 17:16:09.226269 ip-10-0-138-58 kubenswrapper[2573]: I0416 
17:16:09.225870 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="9974a58f-765a-41bc-bebc-aafb5c7e387a" containerName="main" Apr 16 17:16:09.226269 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:16:09.225877 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="5535f7d5-c91c-45d4-bd9d-4a6e3fe3a8f2" containerName="main" Apr 16 17:16:09.226269 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:16:09.225884 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="e5bdb323-afb3-490d-a32e-0f2b04579c86" containerName="main" Apr 16 17:16:09.231227 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:16:09.231206 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-msdj2/perf-node-gather-daemonset-sgb8l" Apr 16 17:16:09.234148 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:16:09.234126 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-msdj2\"/\"default-dockercfg-4rjqv\"" Apr 16 17:16:09.234258 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:16:09.234177 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-msdj2\"/\"kube-root-ca.crt\"" Apr 16 17:16:09.234258 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:16:09.234219 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-msdj2\"/\"openshift-service-ca.crt\"" Apr 16 17:16:09.234411 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:16:09.234297 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-msdj2/perf-node-gather-daemonset-sgb8l"] Apr 16 17:16:09.249211 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:16:09.249190 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kq44r\" (UniqueName: \"kubernetes.io/projected/a56fabf1-625a-4aaa-91fa-facbe1bca9af-kube-api-access-kq44r\") pod 
\"perf-node-gather-daemonset-sgb8l\" (UID: \"a56fabf1-625a-4aaa-91fa-facbe1bca9af\") " pod="openshift-must-gather-msdj2/perf-node-gather-daemonset-sgb8l"
Apr 16 17:16:09.249328 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:16:09.249233 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a56fabf1-625a-4aaa-91fa-facbe1bca9af-sys\") pod \"perf-node-gather-daemonset-sgb8l\" (UID: \"a56fabf1-625a-4aaa-91fa-facbe1bca9af\") " pod="openshift-must-gather-msdj2/perf-node-gather-daemonset-sgb8l"
Apr 16 17:16:09.249328 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:16:09.249252 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a56fabf1-625a-4aaa-91fa-facbe1bca9af-lib-modules\") pod \"perf-node-gather-daemonset-sgb8l\" (UID: \"a56fabf1-625a-4aaa-91fa-facbe1bca9af\") " pod="openshift-must-gather-msdj2/perf-node-gather-daemonset-sgb8l"
Apr 16 17:16:09.249328 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:16:09.249279 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/a56fabf1-625a-4aaa-91fa-facbe1bca9af-proc\") pod \"perf-node-gather-daemonset-sgb8l\" (UID: \"a56fabf1-625a-4aaa-91fa-facbe1bca9af\") " pod="openshift-must-gather-msdj2/perf-node-gather-daemonset-sgb8l"
Apr 16 17:16:09.249500 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:16:09.249342 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/a56fabf1-625a-4aaa-91fa-facbe1bca9af-podres\") pod \"perf-node-gather-daemonset-sgb8l\" (UID: \"a56fabf1-625a-4aaa-91fa-facbe1bca9af\") " pod="openshift-must-gather-msdj2/perf-node-gather-daemonset-sgb8l"
Apr 16 17:16:09.269428 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:16:09.269401 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7d955d5dd4-2gpsx_96edca93-47fe-432e-86c9-b734c62b1424/volume-data-source-validator/0.log"
Apr 16 17:16:09.350022 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:16:09.350001 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a56fabf1-625a-4aaa-91fa-facbe1bca9af-sys\") pod \"perf-node-gather-daemonset-sgb8l\" (UID: \"a56fabf1-625a-4aaa-91fa-facbe1bca9af\") " pod="openshift-must-gather-msdj2/perf-node-gather-daemonset-sgb8l"
Apr 16 17:16:09.350104 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:16:09.350025 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a56fabf1-625a-4aaa-91fa-facbe1bca9af-lib-modules\") pod \"perf-node-gather-daemonset-sgb8l\" (UID: \"a56fabf1-625a-4aaa-91fa-facbe1bca9af\") " pod="openshift-must-gather-msdj2/perf-node-gather-daemonset-sgb8l"
Apr 16 17:16:09.350104 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:16:09.350050 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/a56fabf1-625a-4aaa-91fa-facbe1bca9af-proc\") pod \"perf-node-gather-daemonset-sgb8l\" (UID: \"a56fabf1-625a-4aaa-91fa-facbe1bca9af\") " pod="openshift-must-gather-msdj2/perf-node-gather-daemonset-sgb8l"
Apr 16 17:16:09.350104 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:16:09.350068 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/a56fabf1-625a-4aaa-91fa-facbe1bca9af-podres\") pod \"perf-node-gather-daemonset-sgb8l\" (UID: \"a56fabf1-625a-4aaa-91fa-facbe1bca9af\") " pod="openshift-must-gather-msdj2/perf-node-gather-daemonset-sgb8l"
Apr 16 17:16:09.350261 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:16:09.350115 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kq44r\" (UniqueName: \"kubernetes.io/projected/a56fabf1-625a-4aaa-91fa-facbe1bca9af-kube-api-access-kq44r\") pod \"perf-node-gather-daemonset-sgb8l\" (UID: \"a56fabf1-625a-4aaa-91fa-facbe1bca9af\") " pod="openshift-must-gather-msdj2/perf-node-gather-daemonset-sgb8l"
Apr 16 17:16:09.350261 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:16:09.350140 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a56fabf1-625a-4aaa-91fa-facbe1bca9af-sys\") pod \"perf-node-gather-daemonset-sgb8l\" (UID: \"a56fabf1-625a-4aaa-91fa-facbe1bca9af\") " pod="openshift-must-gather-msdj2/perf-node-gather-daemonset-sgb8l"
Apr 16 17:16:09.350261 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:16:09.350152 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/a56fabf1-625a-4aaa-91fa-facbe1bca9af-proc\") pod \"perf-node-gather-daemonset-sgb8l\" (UID: \"a56fabf1-625a-4aaa-91fa-facbe1bca9af\") " pod="openshift-must-gather-msdj2/perf-node-gather-daemonset-sgb8l"
Apr 16 17:16:09.350261 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:16:09.350202 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a56fabf1-625a-4aaa-91fa-facbe1bca9af-lib-modules\") pod \"perf-node-gather-daemonset-sgb8l\" (UID: \"a56fabf1-625a-4aaa-91fa-facbe1bca9af\") " pod="openshift-must-gather-msdj2/perf-node-gather-daemonset-sgb8l"
Apr 16 17:16:09.350261 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:16:09.350208 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/a56fabf1-625a-4aaa-91fa-facbe1bca9af-podres\") pod \"perf-node-gather-daemonset-sgb8l\" (UID: \"a56fabf1-625a-4aaa-91fa-facbe1bca9af\") " pod="openshift-must-gather-msdj2/perf-node-gather-daemonset-sgb8l"
Apr 16 17:16:09.358011 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:16:09.357992 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kq44r\" (UniqueName: \"kubernetes.io/projected/a56fabf1-625a-4aaa-91fa-facbe1bca9af-kube-api-access-kq44r\") pod \"perf-node-gather-daemonset-sgb8l\" (UID: \"a56fabf1-625a-4aaa-91fa-facbe1bca9af\") " pod="openshift-must-gather-msdj2/perf-node-gather-daemonset-sgb8l"
Apr 16 17:16:09.542535 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:16:09.542513 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-msdj2/perf-node-gather-daemonset-sgb8l"
Apr 16 17:16:09.659202 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:16:09.659169 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-msdj2/perf-node-gather-daemonset-sgb8l"]
Apr 16 17:16:09.662009 ip-10-0-138-58 kubenswrapper[2573]: W0416 17:16:09.661983 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-poda56fabf1_625a_4aaa_91fa_facbe1bca9af.slice/crio-7fcf87e6d452c6c2a1a18d7f4c946ed7ac07154a1e54775abfb66de118379090 WatchSource:0}: Error finding container 7fcf87e6d452c6c2a1a18d7f4c946ed7ac07154a1e54775abfb66de118379090: Status 404 returned error can't find the container with id 7fcf87e6d452c6c2a1a18d7f4c946ed7ac07154a1e54775abfb66de118379090
Apr 16 17:16:09.663506 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:16:09.663491 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 17:16:10.080587 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:16:10.080562 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-lhgcr_db86a360-38b7-4c87-ac77-176127220106/dns/0.log"
Apr 16 17:16:10.098298 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:16:10.098277 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-lhgcr_db86a360-38b7-4c87-ac77-176127220106/kube-rbac-proxy/0.log"
Apr 16 17:16:10.138514 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:16:10.138495 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-c8bgf_38487532-b4be-41ef-a345-cea3cc5a643c/dns-node-resolver/0.log"
Apr 16 17:16:10.197648 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:16:10.197624 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-msdj2/perf-node-gather-daemonset-sgb8l" event={"ID":"a56fabf1-625a-4aaa-91fa-facbe1bca9af","Type":"ContainerStarted","Data":"dfd8713a40740215fd66b2c861acf7d0d926be1ad5c60048b4810b9ce599624a"}
Apr 16 17:16:10.197758 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:16:10.197652 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-msdj2/perf-node-gather-daemonset-sgb8l" event={"ID":"a56fabf1-625a-4aaa-91fa-facbe1bca9af","Type":"ContainerStarted","Data":"7fcf87e6d452c6c2a1a18d7f4c946ed7ac07154a1e54775abfb66de118379090"}
Apr 16 17:16:10.197805 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:16:10.197758 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-msdj2/perf-node-gather-daemonset-sgb8l"
Apr 16 17:16:10.211523 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:16:10.211482 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-msdj2/perf-node-gather-daemonset-sgb8l" podStartSLOduration=1.211470709 podStartE2EDuration="1.211470709s" podCreationTimestamp="2026-04-16 17:16:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 17:16:10.211095034 +0000 UTC m=+1695.351813776" watchObservedRunningTime="2026-04-16 17:16:10.211470709 +0000 UTC m=+1695.352189452"
Apr 16 17:16:10.619498 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:16:10.619473 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-49l6w_a3cab79a-4ea1-4744-b205-fd85c929391f/node-ca/0.log"
Apr 16 17:16:11.477981 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:16:11.477928 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-5887855b54-djpw5_682aafdb-c596-4a7b-8112-c6c867ff770e/router/0.log"
Apr 16 17:16:11.960876 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:16:11.960847 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-zt96l_40e02308-a3e4-43c3-8e6d-b59cfe039143/serve-healthcheck-canary/0.log"
Apr 16 17:16:12.372486 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:16:12.372460 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-5785d4fcdd-x9csm_ebea9173-b1aa-4ef6-a1bb-6b7483b70fd2/insights-operator/0.log"
Apr 16 17:16:12.373191 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:16:12.373171 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-5785d4fcdd-x9csm_ebea9173-b1aa-4ef6-a1bb-6b7483b70fd2/insights-operator/1.log"
Apr 16 17:16:12.459599 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:16:12.459583 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-r6bck_861f1460-ddcc-410d-a721-789653443c7b/kube-rbac-proxy/0.log"
Apr 16 17:16:12.481875 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:16:12.481858 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-r6bck_861f1460-ddcc-410d-a721-789653443c7b/exporter/0.log"
Apr 16 17:16:12.501048 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:16:12.501027 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-r6bck_861f1460-ddcc-410d-a721-789653443c7b/extractor/0.log"
Apr 16 17:16:14.977351 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:16:14.977314 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-65f5d85b79-ppb2c_9f5c241d-078c-45ed-a269-efb2dd6acc38/manager/0.log"
Apr 16 17:16:16.211126 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:16:16.211095 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-msdj2/perf-node-gather-daemonset-sgb8l"
Apr 16 17:16:20.660987 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:16:20.660870 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-756bb7d76f-7w64z_4b32d91f-2c9d-4d71-b910-066e212015e3/kube-storage-version-migrator-operator/1.log"
Apr 16 17:16:20.662526 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:16:20.662503 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-756bb7d76f-7w64z_4b32d91f-2c9d-4d71-b910-066e212015e3/kube-storage-version-migrator-operator/0.log"
Apr 16 17:16:21.611418 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:16:21.611367 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-77mp4_496bd28d-40d9-43b2-91c6-462df146eecc/kube-multus/0.log"
Apr 16 17:16:21.974147 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:16:21.974077 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-t7jrv_50367a6c-7164-45c2-b2f1-af3375aa5768/kube-multus-additional-cni-plugins/0.log"
Apr 16 17:16:21.992398 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:16:21.992356 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-t7jrv_50367a6c-7164-45c2-b2f1-af3375aa5768/egress-router-binary-copy/0.log"
Apr 16 17:16:22.010314 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:16:22.010294 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-t7jrv_50367a6c-7164-45c2-b2f1-af3375aa5768/cni-plugins/0.log"
Apr 16 17:16:22.029491 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:16:22.029472 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-t7jrv_50367a6c-7164-45c2-b2f1-af3375aa5768/bond-cni-plugin/0.log"
Apr 16 17:16:22.047320 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:16:22.047298 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-t7jrv_50367a6c-7164-45c2-b2f1-af3375aa5768/routeoverride-cni/0.log"
Apr 16 17:16:22.065845 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:16:22.065826 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-t7jrv_50367a6c-7164-45c2-b2f1-af3375aa5768/whereabouts-cni-bincopy/0.log"
Apr 16 17:16:22.084186 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:16:22.084163 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-t7jrv_50367a6c-7164-45c2-b2f1-af3375aa5768/whereabouts-cni/0.log"
Apr 16 17:16:22.187726 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:16:22.187708 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-x6gbd_f7e26d85-638f-42c1-9b32-67320a5cbbe3/network-metrics-daemon/0.log"
Apr 16 17:16:22.203860 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:16:22.203843 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-x6gbd_f7e26d85-638f-42c1-9b32-67320a5cbbe3/kube-rbac-proxy/0.log"
Apr 16 17:16:23.528319 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:16:23.528291 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vts2x_cc2084a8-5ef5-4dde-bae7-f84589b59b40/ovn-controller/0.log"
Apr 16 17:16:23.557838 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:16:23.557815 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vts2x_cc2084a8-5ef5-4dde-bae7-f84589b59b40/ovn-acl-logging/0.log"
Apr 16 17:16:23.579832 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:16:23.579806 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vts2x_cc2084a8-5ef5-4dde-bae7-f84589b59b40/kube-rbac-proxy-node/0.log"
Apr 16 17:16:23.598737 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:16:23.598717 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vts2x_cc2084a8-5ef5-4dde-bae7-f84589b59b40/kube-rbac-proxy-ovn-metrics/0.log"
Apr 16 17:16:23.615883 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:16:23.615860 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vts2x_cc2084a8-5ef5-4dde-bae7-f84589b59b40/northd/0.log"
Apr 16 17:16:23.635803 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:16:23.635782 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vts2x_cc2084a8-5ef5-4dde-bae7-f84589b59b40/nbdb/0.log"
Apr 16 17:16:23.655059 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:16:23.655020 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vts2x_cc2084a8-5ef5-4dde-bae7-f84589b59b40/sbdb/0.log"
Apr 16 17:16:23.825055 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:16:23.825030 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vts2x_cc2084a8-5ef5-4dde-bae7-f84589b59b40/ovnkube-controller/0.log"
Apr 16 17:16:24.882097 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:16:24.882073 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-7b678d77c7-2sdvm_533755f5-7620-43ef-aa9a-be97a74e8866/check-endpoints/0.log"
Apr 16 17:16:24.927752 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:16:24.927733 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-md7k7_41dd0be2-82c4-4469-b8d9-d1b98a4adb55/network-check-target-container/0.log"
Apr 16 17:16:25.899679 ip-10-0-138-58 kubenswrapper[2573]: I0416 17:16:25.899645 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-8ghz4_06481c75-0539-4928-baa6-a9ed683f7054/iptables-alerter/0.log"